Patent abstract:
A mobile terminal (100) including a wireless communication unit (110) configured to provide wireless communication; a touch screen; and a controller (180) configured to display on the touch screen an input region (410) including a plurality of character keys and a plurality of edit keys (411a, 411b, 421a, 421b) and an output region (310), to display on the touch screen an edit state display region (510) between the input region (410) and the output region (310) for displaying a word corresponding to touched character keys, to select or edit the word displayed in the edit state display region (510) based on a touch input applied to the input region (410), and to display the selected or edited word on the output region (310).
Publication number: FR3021136A1
Application number: FR1554257
Filing date: 2015-05-12
Publication date: 2015-11-20
Inventors: Sangouk Park; Youngmin Yoon; Hyunjoo Jeon; Bumhee Han
Applicant: LG Electronics Inc.
IPC main class:
Patent description:

[0001] The present invention relates to a mobile terminal including a display unit and a corresponding method for outputting characters by a touch input.
[0002] Terminals can be broadly classified as mobile/portable terminals or fixed terminals. Mobile terminals can also be classified as handheld terminals or vehicle-mounted terminals. As terminal functions become more diverse, the terminal can support more complicated functions such as capturing images or video, playing music or video files, playing games, receiving broadcast signals, and the like. By implementing such functions comprehensively and collectively, the mobile terminal can be realized in the form of a multimedia player or a multimedia device. Mobile terminals include a touch sensor for receiving a touch input, and output a virtual keyboard having a plurality of keys on a region of a display unit. When a touch input is applied to the virtual keyboard, a character is output on a region separate from the virtual keyboard. In this case, the user has to check the character in the output region while applying a touch input to the virtual keyboard, which is inconvenient. Accordingly, one aspect of the detailed description is to provide a display unit that allows a user to enter characters while applying a touch input to a virtual keyboard, without moving his or her eyes to check an output region.
[0003] To achieve these and other advantages and in accordance with the purpose of this description, as embodied and broadly described herein, the present invention provides a mobile terminal including: a touch screen including an input region, which includes a plurality of character keys and a plurality of edit keys, and an output region; and a control unit configured to output, between the input region and the output region, an edit state display region including a word corresponding to character keys based on touch inputs applied to the character keys, in which, based on a touch input applied to the input region, the control unit selects or edits the word in the edit state display region, and controls the touch screen to display the selected word or the edited word on the output region. In another aspect, the present invention provides a method of controlling a mobile terminal including a touch screen divided into an output region and an input region including a plurality of character keys and a plurality of edit keys. The method includes outputting a word on an edit state display region between the input region and the output region based on a touch input applied to the character keys; editing or selecting the word in the edit state display region based on a touch input applied to a portion of the plurality of character keys; and displaying the word on the output region based on a touch input applied to a portion of the plurality of character keys. A broader scope of applicability of the present application will become apparent from the detailed description given below. Nevertheless, it is to be understood that the detailed description and the specific examples, while indicating preferred embodiments of the invention, are given by way of illustration only, since various changes and modifications within the spirit and scope of the invention will become apparent to those skilled in the art from the detailed description.
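For a concrete picture of how such a control unit might behave, the following is a minimal Kotlin sketch written for this summary; the class and function names (EditStateController, onCharacterKey, onEditKey) are illustrative assumptions and are not defined in the patent.

```kotlin
// Hypothetical sketch of the claimed arrangement: characters typed on the input region
// accumulate in an edit state display region, and an edit key moves the composed word
// to the output region.
class EditStateController {
    private val editStateRegion = StringBuilder()   // word currently being composed (510)
    private val outputRegion = StringBuilder()      // committed text (310)

    // Touch on a character key (412): the word is updated in the edit state
    // display region in real time, before it reaches the output region.
    fun onCharacterKey(c: Char) {
        editStateRegion.append(c)
        render()
    }

    // Touch on an edit key (411): e.g. a space key commits the composed word to the
    // output region, a delete key removes the last character of the composed word.
    fun onEditKey(key: String) = when (key) {
        "SPACE"  -> { outputRegion.append(editStateRegion).append(' '); editStateRegion.clear(); render() }
        "DELETE" -> { if (editStateRegion.isNotEmpty()) editStateRegion.deleteCharAt(editStateRegion.length - 1); render() }
        else     -> render()
    }

    private fun render() =
        println("output region: \"$outputRegion\" | edit state display region: \"$editStateRegion\"")
}

fun main() {
    val controller = EditStateController()
    "hello".forEach { controller.onCharacterKey(it) }
    controller.onEditKey("SPACE")
}
```

The point of the sketch is only the data flow: the composed word stays in the edit state display region until an edit key commits it to the output region.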
[0004] The accompanying drawings, which are included to provide a better understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments and, together with the description, serve to explain the principles of the invention. In the drawings: Fig. 1A is a block diagram of a mobile terminal according to an embodiment of the present invention. Figures 1B and 1C are conceptual views of an example of the mobile terminal seen from different directions. Fig. 2 is a flowchart illustrating a control method according to an embodiment of the present invention. Figs. 3A (a) to 3B (c) are conceptual views illustrating a control method of Fig. 2.
[0005] Figs. 4A to 4F (c) are conceptual views illustrating a method of controlling a change of an input position in an edit state display region. Figs. 5A and 5B are conceptual views illustrating a control method of erasing a word output on an edit state display region according to various embodiments of the present invention. Figs. 6 (a) through 6 (d) are conceptual views illustrating a control method of controlling an output position of a word based on a touch input in an edit state display region.
[0006] Figs. 7A to 7E (d) are conceptual views illustrating a control method of entering a word on an input region. Fig. 8 is a conceptual view illustrating a control method of clearly displaying a word based on a touch input applied to an input region in an edit state display region.
[0007] Figs. 9A-9D (d) are conceptual views illustrating screen information outputted to an input region. Figs. 10A to 10C (b) are conceptual views illustrating a control method using screen information outputted to an input region. Figs. 11A (a) to 11B (b) are conceptual views illustrating a control method of changing a size of a virtual keyboard according to a user's input state. Figs. 12A and 12B are conceptual views illustrating a control method of outputting notification information based on a touch input applied to an input region.
[0008] Figs. 13 (a) and 13 (b) are conceptual views illustrating a control method of analyzing information included in an input region and outputting a recommendation word. A detailed description of the embodiments disclosed herein will now be given with reference to the accompanying drawings. For a brief description with reference to the drawings, the same or equivalent components may be provided with the same or like reference numerals, and their description will not be repeated. In general, a suffix such as "module" and "unit" may be used to designate elements or components. The use of such a suffix is merely intended to facilitate the description of the specification, and the suffix itself is not intended to give any special meaning or function. The accompanying drawings are used to aid in easily understanding various technical features, and it should be understood that the embodiments presented herein are not limited by the accompanying drawings. As such, the present invention should be construed as extending to all modifications, equivalents and substitutes in addition to those specifically set out in the accompanying drawings. Although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are generally only used to distinguish one element from another. When an element is referred to as being "connected to" another element, the element can be connected to the other element or intervening elements may also be present. In contrast, when an element is referred to as being "directly connected to" another element, no intervening elements are present.
[0009] A singular representation may include a plural representation unless it represents a definitely different meaning from the context. Terms such as "include" or "have" are used herein and should be understood as indicating the existence of several components, functions or steps disclosed in the specification, and it is also understood that greater or fewer components, functions, or steps may likewise be utilized. The mobile terminals presented here can be implemented using a variety of different types of terminals. Examples of such devices include cellular phones, smart phones, user equipment, laptop computers, digital broadcast terminals, personal digital assistants (PDAs), portable multimedia players (PMPs), navigators, portable computers (PCs), slate PCs, tablet PCs, ultrabooks, wearable devices (e.g., smart watches, smart glasses, head-mounted displays (HMDs)), and the like.
[0010] As a nonlimiting example only, a further description will be given with reference to particular types of mobile terminals. Nevertheless, these teachings also apply to other types of terminals, such as the types noted above. In addition, these teachings can also be applied to fixed terminals such as digital TVs, desktop computers, and the like. Reference is now made to FIGS. 1A to 1C, where FIG. 1A is a block diagram of a mobile terminal in accordance with the present invention, and FIGS. 1B and 1C are conceptual views of an example of the mobile terminal, seen from different directions. The mobile terminal 100 is shown having components such as a wireless communication unit 110, an input unit 120, a detection unit 140, an output unit 150, an interface unit 160, a memory 170, a controller 180 and a power source unit 190. It is to be understood that implementing all of the illustrated components is not a requirement, and that a greater or smaller number of components may alternatively be implemented. Referring now to Figure 1A, the mobile terminal 100 is shown having a wireless communication unit 110 configured with a plurality of jointly implemented components. For example, the wireless communication unit 110 typically includes one or more components that allow wireless communication between the mobile terminal 100 and a wireless communication system or network within which the mobile terminal is located. The wireless communication unit 110 typically includes one or more modules that enable communications such as wireless communications between the mobile terminal 100 and a wireless communication system, communications between the mobile terminal 100 and another mobile terminal, and communications between the mobile terminal 100 and an external server. In addition, the wireless communication unit 110 typically includes one or more modules that connect the mobile terminal 100 to one or more networks. To facilitate such communications, the wireless communication unit 110 includes one or more of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a location information module 115.
[0011] The input unit 120 includes a camera 121 for obtaining images or video, a microphone 122, which is a type of audio input device for inputting an audio signal, and a user input unit 123 (for example, a touch key, a push button, a mechanical key, a function key and the like) allowing a user to enter information. Data (e.g., audio, video, image and the like) is obtained by the input unit 120 and can be analyzed and processed by the controller 180 according to device parameters, user commands and combinations thereof.
[0012] The detection unit 140 is typically implemented using one or more sensors configured to detect internal information of the mobile terminal, the surrounding environment of the mobile terminal, user information and the like. For example, in FIG. 1A, the detection unit 140 is shown having a proximity sensor 141 and an illumination sensor 142.
[0013] If desired, the detection unit 140 may alternatively or additionally include other types of sensors or devices, such as a touch sensor, an acceleration sensor, a magnetic sensor, a G-sensor, a gyroscope sensor, a motion sensor, an RGB sensor, an infrared (IR) sensor, a fingerprint scan sensor, an ultrasonic sensor, an optical sensor (for example a camera 121), a microphone 122, a battery gauge, an environmental sensor (for example a barometer, a hygrometer, a thermometer, a radiation detection sensor, a thermal sensor and a gas sensor, among others), and a chemical sensor (for example an electronic nose, a healthcare sensor, a biometric sensor and the like), just to name a few. The mobile terminal 100 may be configured to use information obtained from the detection unit 140, and in particular, information obtained from one or more sensors of the detection unit 140, and combinations thereof. The output unit 150 is typically configured to output various types of information, such as audio, video, touch, and the like. The output unit 150 is shown having a display unit 151, an audio output unit 152, a haptic module 153, and an optical output module 154. The display unit 151 may have an interlayer structure or a structure integrated with a touch sensor to facilitate a touch screen. The touch screen may provide an output interface between the mobile terminal 100 and a user, as well as operate as the user input unit 123 which provides an input interface between the mobile terminal 100 and the user.
[0014] The interface unit 160 serves as an interface with various types of external devices that can be coupled to the mobile terminal 100. The interface unit 160, for example, can include any one of wired or wireless headset ports, external power source ports, wired or wireless data ports, memory card ports, ports for connecting a device having an identification module, audio I/O ports, video I/O ports, earphone ports, and the like. In some cases, the mobile terminal 100 may perform various control functions associated with a connected external device, in response to the connection of the external device to the interface unit 160.
[0015] The memory 170 is typically implemented to store data to support various functions or features of the mobile terminal 100. For example, the memory 170 may be configured to store application programs executed in the mobile terminal 100, data or instructions for operations of the mobile terminal 100, and the like. Some of these application programs can be downloaded from an external server via wireless communication. Other application programs may be installed within the mobile terminal 100 at the time of manufacture or shipment, which is typically the case for basic functions of the mobile terminal 100 (for example, receiving a call, making a call, receiving a message, sending a message and the like). It is common for the application programs to be stored in the memory 170, installed in the mobile terminal 100, and executed by the controller 180 to perform an operation (or function) for the mobile terminal 100. The controller 180 typically operates to control the overall operation of the mobile terminal 100, in addition to the operations associated with the application programs. The controller 180 may provide or process information or functions appropriate for a user by processing signals, data, information and the like, which are input or output by the various components shown in FIG. 1A, or by activating application programs stored in the memory 170. By way of example, the controller 180 controls all or part of the components illustrated in FIGS. 1A to 1C according to the execution of an application program that has been stored in the memory 170.
[0016] The power source unit 190 may be configured to receive an external power supply or provide an internal power supply in order to provide the appropriate power required to operate the elements and components included in the mobile terminal 100. The power source unit 190 may include a battery, and the battery may be configured to be embedded in the terminal body, or configured to be detachable from the terminal body. Referring again to FIG. 1A, various components shown in this figure will now be described in greater detail. With respect to the wireless communication unit 110, the broadcast receiving module 111 is typically configured to receive a broadcast signal and/or information associated with broadcasting from an external broadcast management entity over a broadcast channel. The broadcast channel may include a satellite channel, a terrestrial channel, or both. In some embodiments, two or more broadcast receiving modules 111 may be used to facilitate simultaneous reception of two or more broadcast channels, or to support switching among broadcast channels. The broadcast management entity may be implemented using a server or system that generates and transmits a broadcast signal and/or information associated with broadcasting, or a server that receives a pre-generated broadcast signal and/or information associated with broadcasting, and sends such items to the mobile terminal. The broadcast signal may be implemented using any of a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and combinations thereof, among others. The broadcast signal may in some cases also include a data broadcast signal combined with a TV or radio broadcast signal. The broadcast signal may be encoded according to any of a variety of technical standards or broadcasting methods (e.g., those of the International Organization for Standardization (ISO), the International Electrotechnical Commission (IEC), Digital Video Broadcasting (DVB), the Advanced Television Systems Committee (ATSC), and the like) for transmitting and receiving digital broadcast signals. The broadcast receiving module 111 may receive the digital broadcast signals using a method appropriate to the transmission method used. Examples of information associated with broadcasting may include information associated with a broadcast channel, a broadcast program, a broadcast event, a broadcast service provider, or the like. The information associated with broadcasting may also be provided via a mobile communication network, and in this case received by the mobile communication module 112. The information associated with broadcasting may be implemented in a variety of formats. For example, the information associated with broadcasting may include an electronic program guide (EPG) of digital multimedia broadcasting (DMB), an electronic service guide (ESG) of digital video broadcast-handheld (DVB-H), and the like. The broadcast signals and/or information associated with broadcasting received via the broadcast receiving module 111 may be stored in a suitable device, such as the memory 170. The mobile communication module 112 may transmit and/or receive wireless signals to and from one or more network entities. Typical examples of a network entity include a base station, an external mobile terminal, a server, and the like.
Such network entities form part of a wireless communication network, which is constructed according to technical standards or communication methods for mobile communications (for example, Global System for Mobile communications (GSM), Code Division Multiple Access (CDMA), CDMA2000, Enhanced Voice-Data Optimized or Enhanced Voice-Data Only (EV-DO), Wideband CDMA (WCDMA), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), LTE-Advanced (LTE-A), and the like). Examples of wireless signals transmitted and/or received via the mobile communication module 112 include audio call signals, video (telephony) call signals, or various formats of data to support communication of text and multimedia messages.
[0017] The wireless Internet module 113 is configured to facilitate wireless Internet access. This module can be coupled to the mobile terminal 100 internally or externally. The wireless Internet module 113 can transmit and / or receive wireless signals via communication networks according to wireless Internet technologies.
[0018] Examples of such wireless Internet access include Wireless LAN (WLAN), Wireless Fidelity (Wi-Fi), Wi-Fi Direct, Digital Living Network Alliance (DLNA), Wireless Broadband (WiBro), Worldwide Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), LTE-Advanced (LTE-A), and the like. The wireless Internet module 113 can transmit/receive data according to one or more of these wireless Internet technologies, as well as other Internet technologies. In some embodiments, when wireless Internet access is implemented according to, for example, WiBro, HSDPA, HSUPA, GSM, CDMA, WCDMA, LTE, LTE-A and the like, as part of a mobile communication network, the wireless Internet module 113 performs such wireless Internet access. As such, the wireless Internet module 113 can cooperate with, or function as, the mobile communication module 112.
[0019] The short-range communication module 114 is configured to facilitate short-range communications. Suitable technologies for implementing such short-range communications include BLUETOOTH™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Near Field Communication (NFC), Wireless Fidelity (Wi-Fi), Wi-Fi Direct, Wireless USB (Universal Serial Bus), and the like. The short-range communication module 114 generally supports wireless communications between the mobile terminal 100 and a wireless communication system, communications between the mobile terminal 100 and another mobile terminal 100, or communications between the mobile terminal and a network where another mobile terminal 100 (or an external server) is located, via wireless local area networks. One example of the wireless local area networks is a wireless personal area network. In some embodiments, another mobile terminal (which may be configured similarly to the mobile terminal 100) may be a wearable device, for example a smart watch, smart glasses or a head-mounted display (HMD), which can exchange data with the mobile terminal 100 (or otherwise cooperate with the mobile terminal 100). The short-range communication module 114 can detect or recognize the wearable device, and allow communication between the wearable device and the mobile terminal 100. In addition, when the detected wearable device is a device that is authenticated to communicate with the mobile terminal 100, the controller 180, for example, can cause data processed in the mobile terminal 100 to be transmitted to the wearable device via the short-range communication module 114. Hence, a user of the wearable device can use the data processed in the mobile terminal 100 on the wearable device. For example, when a call is received in the mobile terminal 100, the user can answer the call using the wearable device. Similarly, when a message is received in the mobile terminal 100, the user can check the received message using the wearable device. The location information module 115 is generally configured to detect, calculate, derive or otherwise identify a position of the mobile terminal. For example, the location information module 115 includes a global positioning system (GPS) module, a Wi-Fi module, or both. If desired, the location information module 115 may alternatively or additionally operate with any of the other modules of the wireless communication unit 110 to obtain data relating to the position of the mobile terminal. For example, when the mobile terminal uses a GPS module, a position of the mobile terminal can be acquired by using a signal sent from a GPS satellite. As another example, when the mobile terminal uses the Wi-Fi module, a position of the mobile terminal can be acquired based on information relating to a wireless access point (AP) which transmits or receives a wireless signal to or from the Wi-Fi module. The input unit 120 may be configured to allow various types of input to the mobile terminal 100. Examples of such input include audio, image, video, data and user input. Image and video input is often obtained using one or more cameras 121. Such cameras 121 may process image frames of still pictures or video obtained by image sensors in a video or image capture mode. The processed image frames may be displayed on the display unit 151 or stored in the memory 170. In some cases, the cameras 121 may be arranged in a matrix configuration to allow a plurality of images having various angles
or focal points to be input to the mobile terminal 100. As another example, the cameras 121 may be located in a stereoscopic arrangement for acquiring left and right images for implementing a stereoscopic image. The microphone 122 is generally implemented to allow audio input to the mobile terminal 100. The audio input may be processed in various ways according to a function performed in the mobile terminal 100. If desired, the microphone 122 may include various noise suppression algorithms to suppress unwanted noise generated in the course of receiving the external audio. The user input unit 123 is a component that allows input by a user. Such user input may allow the controller 180 to control the operation of the mobile terminal 100. The user input unit 123 may include one or more of a mechanical input element (for example, a key, a button located on a front and/or rear surface or a side surface of the mobile terminal 100, a dome switch, a jog wheel, a jog switch and the like), or a touch-sensitive input, among others. For example, the touch-sensitive input may be a virtual key or a function key, which is displayed on a touch screen by software processing, or a touch key that is located on the mobile terminal at a location other than the touch screen. In addition, the virtual key or the visual key can be displayed on the touch screen in various forms, for example graphic, text, icon, video, or a combination thereof. The detection unit 140 is generally configured to detect one or more of internal information of the mobile terminal, information of the surrounding environment of the mobile terminal, user information, or the like. The controller 180 generally cooperates with the detection unit 140 to control the operation of the mobile terminal 100, or to execute data processing, a function, or an operation associated with an application program installed in the mobile terminal, based on the detection provided by the detection unit 140. The detection unit 140 may be implemented using any of a variety of sensors, some of which will now be described in more detail. The proximity sensor 141 may include a sensor for detecting the presence or absence of an object approaching a surface, or an object located near a surface, using an electromagnetic field, infrared rays, or the like without mechanical contact. The proximity sensor 141 may be arranged at an inner region of the mobile terminal covered by the touch screen, or near the touch screen. The proximity sensor 141 may, for example, include any one of a transmissive type photoelectric sensor, a direct reflective type photoelectric sensor, a mirror-reflective type photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitance type proximity sensor, a magnetic type proximity sensor, an infrared proximity sensor, and the like. When the touch screen is implemented as a capacitance type, the proximity sensor 141 can detect the proximity of a pointer to the touch screen by changes in an electromagnetic field, which responds to the approach of an object having conductivity. In this case, the touch screen (touch sensor) can also be categorized as a proximity sensor.
[0020] The term "proximity touch" will often be used here to refer to the scenario in which a pointer is positioned to be close to the touch screen without coming into contact with the touch screen. The term "touch contact" will often be used here to refer to the scenario in which a pointer makes physical contact with the touch screen. For the position corresponding to the proximity touch of the pointer relative to the touch screen, such a position will correspond to a position where the pointer is perpendicular to the touch screen. The proximity sensor 141 can detect a proximity touch, and proximity touch patterns (e.g., distance, direction, speed, time, position, travel status, and the like).
[0021] In general, the controller 180 processes data corresponding to proximity touches and proximity touch patterns detected by the proximity sensor 141, and causes visual information to be output on the touch screen. In addition, the controller 180 may control the mobile terminal 100 to perform different operations or process different data depending on whether a touch with respect to a point on the touch screen is either a proximity touch or a contact touch. A touch sensor can detect a touch applied to the touch screen, such as the display unit 151, using any of a variety of touch methods. Examples of such touch methods include a resistive type, a capacitive type, an infrared type, and a magnetic field type, among others. As an example, the touch sensor may be configured to convert changes of pressure applied to a specific portion of the display unit 151, or a capacitance occurring at a specific portion of the display unit 151, into electrical input signals. The touch sensor can also be configured to detect not only a touched position and a touched area, but also touch pressure and/or touch capacitance. A touch object is generally used to apply a touch input to the touch sensor.
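As a rough illustration of that branching, the following minimal Kotlin sketch dispatches differently on a proximity touch and a contact touch; the event types and handler names are assumptions made for the example, not structures described in the patent.

```kotlin
// Illustrative only: the controller may process a proximity (hover) touch and a
// contact touch as two different kinds of events and react to each differently.
sealed class TouchEvent { abstract val x: Float; abstract val y: Float }
data class ProximityTouch(override val x: Float, override val y: Float, val distanceMm: Float) : TouchEvent()
data class ContactTouch(override val x: Float, override val y: Float, val pressure: Float) : TouchEvent()

fun handleTouch(event: TouchEvent) = when (event) {
    is ProximityTouch -> println("hover at (${event.x}, ${event.y}), about ${event.distanceMm} mm away")
    is ContactTouch   -> println("press at (${event.x}, ${event.y}) with pressure ${event.pressure}")
}

fun main() {
    handleTouch(ProximityTouch(120f, 300f, distanceMm = 8f))
    handleTouch(ContactTouch(120f, 300f, pressure = 0.6f))
}
```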
[0022] Examples of typical touch objects include a finger, a pencil, a stylus, a pointer, or the like. When a touch input is detected by a touch sensor, corresponding signals can be transmitted to a touch controller. The touch controller can process the received signals and then transmit corresponding data to the controller 180. Accordingly, the controller 180 can detect which region of the display unit 151 has been touched. Here, the touch controller may be a component separate from the controller 180, combined with the controller 180, and combinations thereof. In some embodiments, the controller 180 may perform the same or different commands depending on the type of touch object that touches the touch screen or a touch key provided in addition to the touch screen. Whether to execute the same command or a different command depending on the object that provides a touch input may be decided based on a current operating state of the mobile terminal 100 or an application program currently running, for example. The touch sensor and the proximity sensor can be implemented individually, or in combination, to detect various types of touches. Such touches include a short touch, a long touch, a multi-touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, a hovering touch, and the like. If desired, an ultrasonic sensor may be implemented to recognize position information relating to a touch object using ultrasonic waves. The controller 180 may, for example, calculate a position of a wave generation source based on information detected by an illumination sensor and a plurality of ultrasonic sensors. Since light is much faster than ultrasonic waves, the time required for the light to reach the optical sensor is much shorter than the time required for the ultrasonic wave to reach the ultrasonic sensor. The position of the wave generation source can be calculated using this fact. For example, the position of the wave generation source can be calculated using the time difference of the ultrasonic wave's arrival at the sensor, with the light used as a reference signal. The camera 121 typically includes at least one camera sensor (CCD, CMOS, etc.), a photosensitive sensor (or image sensors) and a laser sensor. Implementing the camera 121 with a laser sensor can allow the detection of a touch of a physical object with respect to a 3D stereoscopic image. The photosensitive sensor may be laminated on, or overlapped by, the display device. The photosensitive sensor may be configured to scan the movement of the physical object near the touch screen. In more detail, the photosensitive sensor may include photodiodes and transistors at rows and columns for scanning the content received at the photosensitive sensor using an electrical signal that changes depending on the amount of light applied. Namely, the photosensitive sensor can calculate the coordinates of the physical object according to a variation of light to thereby obtain position information of the physical object. The display unit 151 is generally configured to output information processed in the mobile terminal 100. For example, the display unit 151 may display execution screen information of an application program executing at the mobile terminal 100, or user interface (UI) and graphical user interface (GUI) information in response to the execution screen information.
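The time-difference idea can be made concrete with the hedged sketch below: it assumes the light pulse arrives essentially instantaneously, converts each ultrasonic delay into a distance, and intersects two distance circles. The sensor coordinates, the speed-of-sound constant and the two-sensor geometry are illustrative assumptions, not values given in the patent.

```kotlin
import kotlin.math.sqrt

// Illustrative only: estimate the source position from ultrasonic arrival delays,
// using the (almost instantaneous) light pulse as the time reference.
const val SPEED_OF_SOUND_MM_PER_US = 0.343  // ~343 m/s expressed in mm per microsecond

data class Sensor(val x: Double, val y: Double)

// Each delay is (ultrasound arrival time - light arrival time) in microseconds,
// so the distance to the source is simply speed_of_sound * delay.
fun distancesFromDelays(delaysUs: List<Double>): List<Double> =
    delaysUs.map { it * SPEED_OF_SOUND_MM_PER_US }

// Two-sensor example: intersect the two distance circles, assuming for simplicity
// that both sensors lie on the same horizontal line and taking the solution with y >= 0.
fun locate(s1: Sensor, s2: Sensor, d1: Double, d2: Double): Pair<Double, Double> {
    val dx = s2.x - s1.x
    val x = (d1 * d1 - d2 * d2 + dx * dx) / (2 * dx)
    val y = sqrt((d1 * d1 - x * x).coerceAtLeast(0.0))
    return Pair(s1.x + x, s1.y + y)
}

fun main() {
    val (d1, d2) = distancesFromDelays(listOf(291.5, 437.3))   // delays in microseconds
    println(locate(Sensor(0.0, 0.0), Sensor(120.0, 0.0), d1, d2))
}
```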
In some embodiments, the display unit 151 may be implemented as a stereoscopic display unit for displaying stereoscopic images. A typical stereoscopic display unit may employ a stereoscopic display scheme such as a stereoscopic scheme (a glasses scheme), an autostereoscopic scheme (a glasses-free scheme), a projection scheme (a holographic scheme), or the like. The audio output unit 152 is generally configured to output audio data. Such audio data can be obtained from any one of a number of different sources, such that the audio data may be received from the wireless communication unit 110 or may have been stored in the memory 170. The audio data may be output during modes such as a signal receiving mode, a call mode, a recording mode, a voice recognition mode, a broadcast receiving mode, and the like. The audio output unit 152 may provide audible output relating to a particular function (e.g., a call signal receiving sound, a message receiving sound, etc.) performed by the mobile terminal 100. The audio output unit 152 may also be implemented as a receiver, a speaker, a buzzer, or the like. A haptic module 153 may be configured to generate various tactile effects that a user feels, perceives or otherwise experiences. A typical example of a tactile effect generated by the haptic module 153 is vibration. The intensity, pattern, and the like of the vibration generated by the haptic module 153 may be controlled by a user selection or by setting by the controller. For example, the haptic module 153 can output different vibrations in a combined or sequential manner. In addition to vibration, the haptic module 153 can generate various other tactile effects, including a stimulating effect such as a pin arrangement moving vertically to contact the skin, a spray force or an air suction force through a jet orifice or a suction opening, a touch to the skin, a contact of an electrode, an electrostatic force, an effect that reproduces the sensation of cold and warmth by using an element that can absorb or generate heat, and the like. The haptic module 153 can also be implemented to allow the user to feel a tactile effect through a muscular sensation such as in the fingers or the arm of the user, as well as by transferring the tactile effect by direct contact. Two or more haptic modules 153 may be provided according to the particular configuration of the mobile terminal 100. An optical output module 154 may output a signal for indicating event generation using the light of a light source. Examples of events generated in the mobile terminal 100 may include message reception, call waiting, a missed call, an alarm, a calendar announcement, e-mail reception, receipt of information through an application, and the like. A signal output by the optical output module 154 may be implemented such that the mobile terminal emits monochromatic light or light of a plurality of colors. The signal output may be terminated when the mobile terminal detects that a user has checked the generated event, for example. The interface unit 160 serves as an interface for external devices to be connected to the mobile terminal 100. For example, the interface unit 160 can receive data transmitted from an external device, receive power for transfer to elements and components within the mobile terminal 100, or transmit internal data of the mobile terminal 100 to such an external device.
The interface unit 160 may include wired or wireless headphone ports, external power source ports, wired or wireless data ports, memory card ports, ports for connection of a device having an identification module, audio input/output (I/O) ports, video I/O ports, earphone ports, or the like. The identification module may be a chip that stores various information for authenticating the right to use the mobile terminal 100 and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM) and the like. In addition, the device comprising the identification module (also referred to herein as an "identifying device") can take the form of a smart card. As a result, the identifying device can be connected to the mobile terminal 100 via the interface unit 160. When the mobile terminal 100 is connected to an external cradle, the interface unit 160 can serve as a passage to allow power from the cradle to be supplied to the mobile terminal 100, or can serve as a passage to allow various command signals input by the user from the cradle to be transferred to the mobile terminal. Various command signals or the power input from the cradle may function as signals for recognizing that the mobile terminal is properly mounted on the cradle.
[0023] The memory 170 may store programs to support operations of the controller 180 and store input/output data (e.g., phone book, messages, still images, videos, etc.). The memory 170 can store data relating to various vibration and audio patterns that are output in response to touch inputs on the touch screen.
[0024] The memory 170 may include one or more types of storage media including a flash memory, a hard disk, a solid-state disk, a silicon disk, a multimedia card micro type, a card-type memory (for example, SD or DX memory, etc.), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like. The mobile terminal 100 can also be operated in connection with a network storage device that performs the storage function of the memory 170 over a network, such as the Internet.
[0025] The controller 180 may typically control the general operations of the mobile terminal 100. For example, the controller 180 may set or release a lock state for restricting a user from entering a control command with respect to applications when a state of the mobile terminal satisfies a preset condition.
[0026] The controller 180 may also perform the control and processing associated with voice calls, data communications, video calls and the like, or perform pattern recognition processing to recognize a handwritten input or a drawn input made on the touch screen in the form of characters (or word) or images, respectively. In addition, the controller 180 may control a component or combination of these components to implement various embodiments disclosed herein. The power source unit 190 receives an external power supply or provides an internal power supply and provides the appropriate power required to operate the respective elements and components included in the mobile terminal 100. The power source unit 190 may include a battery, which is typically rechargeable or is detachably coupled to the terminal body for charging.
[0027] The power source unit 190 may include a connection port. The connection port may be configured as an example of the interface unit 160 to which an external charger for providing power to recharge the battery is electrically connected. As another example, the power source unit 190 may be configured to recharge the battery wirelessly without use of the connection port. In this example, the power source unit 190 can receive power, transferred from an external wireless power transmitter, using either an inductive coupling method that is based on magnetic induction or a magnetic resonance coupling method that is based on electromagnetic resonance. Various embodiments described herein may be implemented in a computer-readable medium, a machine-readable medium or a similar medium using, for example, software, hardware, or a combination thereof. Referring now to FIGS. 1B and 1C, the mobile terminal 100 is described with reference to a bar-type terminal body. However, the mobile terminal 100 may alternatively be implemented in any of a variety of different configurations. Examples of such configurations include watch-type, clip-type, glasses-type, or folder-type, flip-type, slide-type, swing-type, and swivel-type, in which two or more bodies are combined with each other in a relatively movable manner, and combinations thereof. The present discussion often relates to a particular type of mobile terminal (for example, a bar type, a watch type, a glasses type and the like).
[0028] Nevertheless, such teachings with regard to a particular type of mobile terminal will also generally apply to other types of mobile terminals. The mobile terminal 100 will generally include a housing (for example a frame, a housing, a shell and the like) forming the appearance of the terminal. In this embodiment, the housing is formed using a front housing 101 and a rear housing 102. Various electronic components are incorporated in a space formed between the front housing 101 and the rear housing 102. Additionally, at least one middle case may be positioned between the front housing 101 and the rear housing 102. The display unit 151 is shown on the front side of the terminal body for outputting information. As illustrated, a window 151a of the display unit 151 may be mounted on the front housing 101 to form the front surface of the terminal body together with the front housing 101. In some embodiments, electronic components may also be mounted on the rear housing 102. Examples of such electronic components include a detachable battery 191, an identification module, a memory card and the like. The back shell 103 is shown covering the electronic components, and this shell can be detachably coupled to the rear housing 102. As a result, when the back shell 103 is detached from the rear housing 102, the electronic components mounted on the rear housing 102 are externally exposed. As illustrated, when the back shell 103 is coupled to the rear housing 102, a side surface of the rear housing 102 is partially exposed. In some cases, during coupling, the rear housing 102 may be completely obscured by the back shell 103. In some embodiments, the back shell 103 may include an opening for externally exposing a camera 121b or an audio output unit 152b. The housings 101, 102, 103 may be formed by injection molding of a synthetic resin or may be formed of a metal, for example stainless steel (STS), aluminum (Al), titanium (Ti) or the like. As an alternative to the example in which the plurality of housings form an internal space for housing components, the mobile terminal 100 may be configured such that a single housing forms the internal space. In this example, a mobile terminal 100 having a uni-body is formed such that a synthetic resin or metal extends from a side surface to a rear surface. If desired, the mobile terminal 100 may include a water sealing unit to prevent the introduction of water into the terminal body.
[0029] For example, the water sealing unit may include a watertight member that is located between the window 151a and the front housing 101, between the front housing 101 and the rear housing 102, or between the rear housing 102 and the back shell 103, for hermetically sealing an internal space when these housings are coupled.
[0030] The mobile terminal 100 may include the display unit 151, first and second audio output units 152a and 152b, the proximity sensor 141, an illumination sensor 142, an optical output module 154, first and second cameras 121a and 121b, first and second handling units 123a and 123b, a microphone 122, an interface unit 160, and the like.
[0031] Hereinafter, as illustrated in FIGS. 1B and 1C, the mobile terminal 100 in which the display unit 151, the first audio output unit 152a, the proximity sensor 141, the illumination sensor 142, the optical output module 154, the first camera 121a and the first handling unit 123a are disposed on a front surface of the terminal body, the second handling unit 123b, the microphone 122 and the interface unit 160 are disposed on a side of the terminal body, and the second audio output unit 152b and the second camera 121b are disposed on a rear surface of the terminal body will be described by way of example. However, the components are not limited to this configuration. Components may be excluded, replaced or placed on other surfaces if necessary. For example, the first handling unit 123a may not be provided on the front surface of the terminal body, and the second audio output unit 152b may be provided on the side of the terminal body rather than on the rear surface of the terminal body. The display unit 151 may display (or output) information processed in the mobile terminal 100. For example, the display unit 151 may display execution screen information of an application program driven in the mobile terminal 100, or user interface (UI) or graphical user interface (GUI) information according to the execution screen information. The display unit 151 may include a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, a 3-dimensional (3D) display and an e-ink display. The display unit 151 may be implemented using two display devices, which may implement the same display technology or different technologies. For example, a plurality of display units 151 may be arranged on one side, spaced apart from each other, or these devices may be integrated, or these devices may be arranged on different surfaces. The display unit 151 may also include a touch sensor that detects a touch input received at the display unit. When a touch is input to the display unit 151, the touch sensor may be configured to detect that touch, and the controller 180 may, for example, generate a control command or other signal corresponding to the touch. Content that is entered by touch may be a text or numerical value, or a menu item that can be specified or designated in various modes. The touch sensor may be configured as a film having a touch pattern, disposed between the window 151a and a display on a rear surface of the window 151a, or as a metal wire that is patterned directly on the rear surface of the window 151a. Alternatively, the touch sensor can be integrally formed with the display. For example, the touch sensor may be disposed on a substrate of the display or within the display.
[0032] The display unit 151 may also form a touch screen together with the touch sensor. Here, the touch sensor can serve as a user input unit 123 (see Figure 1A). As a result, the touch screen can replace at least a portion of the functions of the first handling unit 123a. The first audio output unit 152a can be implemented as a receiver, and the second audio output unit 152b can be implemented as a speaker to output voice audio, alarm sounds, multimedia audio reproduction and the like.
[0033] The window 151a of the display unit 151 will typically include an aperture to allow the audio generated by the first audio output unit 152a to pass. An alternative is to allow the release of audio along an assembly gap between the structural bodies (for example a gap between the window 151a and the front housing 101). In this case, an independently formed hole for outputting audio sounds may not be seen or may otherwise be hidden in appearance, further simplifying the appearance and fabrication of the mobile terminal 100. The optical output module 154 may be configured to output light for indicating the generation of an event. Examples of such events include message reception, call waiting reception, a missed call, an alarm, a calendar announcement, e-mail reception, receipt of information through an application, and the like. When a user has checked a generated event, the controller can control the optical output module 154 to stop the light output.
[0034] The first camera 121a can process image frames of still or moving images obtained by the image sensor in a capture mode or a video call mode. The processed image frames may then be displayed on the display unit 151 or stored in the memory 170. The first and second handling units 123a and 123b are examples of the user input unit 123, which may be manipulated by a user to provide an input to the mobile terminal 100. The first and second handling units 123a and 123b may also be commonly referred to as a handling portion, and may employ any touch method that allows the user to perform a manipulation such as touching, pushing, scrolling, or the like. The first and second handling units 123a and 123b may also employ any non-tactile method that allows the user to perform a manipulation such as a proximity touch, a hovering touch, or the like. Figure 1B illustrates the first handling unit 123a as a touch key, but possible variants include a mechanical key, a push button, a touch key, and combinations thereof. An input received at the first and second handling units 123a and 123b may be used in a variety of ways. For example, the first handling unit 123a may be used by the user to provide an input to a menu, a home key, a cancel, a search, or the like, and the second handling unit 123b may be used by the user to provide an input for controlling a volume level output by the first or second audio output unit 152a or 152b, to switch to a touch recognition mode of the display unit 151, or the like. As another example of the user input unit 123, a rear input unit may be located on the rear surface of the terminal body. The rear input unit may be manipulated by a user to provide an input to the mobile terminal 100. The input may be used in a variety of different ways. For example, the rear input unit may be used by the user to provide an input for power on/off, start, end, scroll, control of the volume level output by the first or second audio output unit 152a or 152b, switching to a touch recognition mode of the display unit 151, and the like. The rear input unit can be configured to allow a touch input, a push input, or combinations thereof. The rear input unit may be located so as to overlap the display unit 151 of the front side in a direction of the thickness of the terminal body. For example, the rear input unit may be located on an upper end portion of the rear side of the terminal body so that a user can easily manipulate it using an index finger when the user grasps the terminal body with one hand. However, the present invention is not limited thereto, and a position of the rear input unit can be changed. When the rear input unit is provided on the rear surface of the terminal body, a new user interface can be implemented. Also, when the touch screen or the rear input unit as described above replaces at least some functions of the first handling unit 123a on the front surface of the terminal body, so that the first handling unit 123a is omitted from the front surface of the terminal body, the display unit 151 may have a larger screen. As a further alternative, the mobile terminal 100 may include a fingerprint scan sensor that scans a user's fingerprint. The controller 180 can then use fingerprint information detected by the fingerprint scan sensor as part of an authentication procedure. The fingerprint scan sensor may also be installed in the display unit 151 or implemented in the user input unit 123. The microphone 122 is shown located at one end of the mobile terminal 100, but other locations are possible.
If desired, multiple microphones may be implemented, with such an arrangement allowing the reception of stereo sounds. The interface unit 160 may serve as a path for the mobile terminal 100 to interface with external devices. For example, the interface unit 160 may include one or more of a connection terminal for connection to another device (e.g., an earphone, an external speaker, or the like), a port for near-field communication (for example, an Infrared Data Association (IrDA) port, a Bluetooth port, a wireless LAN port, and the like), or a power source terminal for supplying power to the mobile terminal 100. The interface unit 160 may be implemented in the form of a socket for receiving an external card, such as a subscriber identification module (SIM), a user identity module (UIM), or a memory card for storing information. The second camera 121b is shown located at the rear side of the terminal body and includes an image capture direction that is substantially opposite to the image capture direction of the first camera unit 121a. The second camera 121b may include a plurality of lenses arranged along at least one line. The plurality of lenses may also be arranged in a matrix configuration. Such cameras may be referred to as an "array camera". When the second camera 121b is implemented as an array camera, images can be captured in various ways using the plurality of lenses, and images with better qualities can be obtained. As shown in FIG. 1C, a flash 124 is shown adjacent to the second camera 121b. When an image of a subject is captured with the camera 121b, the flash 124 may illuminate the subject. As shown in FIG. 1C, the second audio output unit 152b may be located on the body of the terminal. The second audio output unit 152b may implement stereophonic sound functions together with the first audio output unit 152a, and may also be used to implement a speakerphone mode for call communication. At least one antenna for wireless communication may be located on the body of the terminal. The antenna can be installed in the terminal body or formed by the housing. For example, an antenna that configures a portion of the broadcast receiving module 111 may be retractable into the terminal body. Alternatively, an antenna may be formed using a film attached to an inner surface of the back shell 103, or a housing that includes a conductive material. A power source unit 190 for supplying power to the mobile terminal 100 may include a battery 191, which is mounted in the terminal body or detachably coupled to an exterior of the terminal body. The battery 191 can receive power via a power source cable connected to the interface unit 160. Also, the battery 191 can be recharged wirelessly using a wireless charger. Wireless charging can be implemented by magnetic induction or electromagnetic resonance. The back shell 103 is shown coupled to the rear housing 102 to shield the battery 191, to prevent separation of the battery 191, and to protect the battery 191 from external impact or a foreign object. When the battery 191 is detachable from the terminal body, the back shell 103 may be detachably coupled to the rear housing 102. An accessory for protecting the appearance, or for assisting or extending the functions, of the mobile terminal 100 may also be provided on the mobile terminal 100. As an example of an accessory, a cover or case for covering or accommodating at least one surface of the mobile terminal 100 may be provided. The cover or case can cooperate with the display unit 151 to extend the function of the mobile terminal 100.
Another example of the accessory is a touch pen for assisting or extending a touch input to a touch screen. The mobile terminal 100 according to one embodiment of the present invention includes a function of inputting a word based on a touch input applied to the display unit 151. In one embodiment of the present invention, while a user enters characters (a word) by applying a touch input to a region of the display unit 151, an edit state of the input word is displayed adjacent to the particular region to which the touch is applied. Hereinafter, a control method of outputting an edit state display region according to various embodiments of the present invention will be described. The display unit is implemented as a touch screen that receives a touch input for controlling the mobile terminal. Thus, hereinafter, reference numeral 151 will be given to the touch screen in the description. In particular, Fig. 2 is a flowchart illustrating a control method according to an embodiment of the present invention, and Figs. 3A (a) to 3B (c) are conceptual views illustrating the control method of Fig. 2. Referring to FIGS. 2 and 3A, the touch screen 151 includes an output region 310 and an input region 410. For example, when a memo application for entering and storing content such as a word, an image and the like is executed, the input region 410 can be implemented as a virtual keyboard. The virtual keyboard may include a plurality of touch keys. The virtual keyboard 410 may include a plurality of character keys 412 corresponding to a plurality of inputtable characters and a plurality of edit keys 411 receiving a touch input to control editing such as deletion of a word, a line change, a spacing, a change of the language of a character, and the like. Namely, the controller 180 displays a plurality of edit keys 411 and a plurality of character keys 412 on an input region 410 of the touch screen 151 in step S210. A word (characters) based on a touch input applied to a character key 412 of the virtual keyboard 410, and the like, may be output to the output region 310. Based on a touch input applied to the character key 412, the controller 180 may control the touch screen 151 to output a word corresponding to the character key 412, which has received the touch input, to the output region 310, but the present invention is not limited thereto. Namely, the controller 180 of the mobile terminal 100 according to one embodiment of the present invention can control the touch screen 151 to output the edit state display region 510, including the word corresponding to the character key, based on a touch input applied to the character key 412, in step S220. On the touch screen 151, the edit state display region 510 may be displayed to be adjacent to the input region 410. For example, the edit state display region 510 may be disposed between the output region 310 and the input region 410. The edit state display region 510 may have a bar shape extending in a width direction perpendicular to a length direction of the mobile terminal 100. Hereinafter, the bar-shaped edit state display region 510 will be described, but the present invention is not limited thereto. The edit state display region 510 may be positioned above the virtual keyboard 410. Namely, the controller 180 controls the touch screen 151 to display a word in the edit state display region 510 in real time based on a touch input applied to the input region 410.
Accordingly, the user can view and verify a word to be output on the output region 310 in advance or in real time, while applying a touch input to the input region 410. Also, when a touch input is applied to any one of the plurality of edit keys 411, the controller 180 controls the touch screen 151 to output visual information corresponding to the edit key 411 to which the touch input is applied, on the edit state display region 510, in step S230. The controller 180 performs various editing functions according to the types of the edit keys 411, and controls the touch screen 151 to output images representing the editing functions. Referring to (b) and (c) of Fig. 3A, based on a touch input applied to a first edit key 411a, the controller 180 controls the touch screen 151 to output a first word 501, displayed on the edit state display region 510, on the output region 310. For example, the first edit key 411a may correspond to a space bar. That is, when the first word 501 is output on the edit state display region 510 and a touch input is applied to the first edit key 411a, the controller 180 outputs the first word 501 on the output region 310 and terminates the output of the first word in the edit state display region 510. After that, when a touch input is applied to the first edit key 411a, the controller 180 forms a space after the first word in the output region 310. Alternatively, based on a touch input applied to the first edit key 411a, an output position of the first word 501 may be changed and a space may be formed after the first word 501. At the same time, when a touch input is applied to the first edit key 411a, the controller 180 controls the touch screen 151 to output first visual information 501'. Here, the first visual information 501' can be an animation image of the first word 501 moving from the edit state display region 510 to the output region 310. For example, the animation image can be the first word 501 progressively moving to the output region 310 over time while being progressively decreased or increased in size. At the same time, referring to Figs. 3A(b) and 3A(d), based on a touch input applied to a second preset edit key 411b, the controller 180 erases a portion of the first word 501 displayed in the edit state display region 510. Namely, the second edit key 411b may correspond to an input cancel key for deleting a word, or the like. When a touch input is applied to the second edit key 411b, the controller 180 controls the touch screen 151 to output second visual information 511 to the edit state display region 510. The second visual information 511 may correspond to a preset form of the edit state display region 510. For example, the second visual information 511 may correspond to a shape, a color, and the like, forming the edit state display region 510. For example, the second visual information 511 may include the shape or color of a line forming the edge of the region in which the first word 501 is displayed, or of the edit state display region 510. For example, the second visual information 511 may correspond to a line image formed in the edge region of the edit state display region 510 and having a preset color. Namely, by the second visual information 511, an edge region of the edit state display region 510 is shown as changed to a particular color. For example, by the second visual information 511, the upper and lower edges of the edit state display region 510 can be displayed in red. 
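As a rough illustration of step S230, the sketch below models the two kinds of visual feedback just described: a commit animation when the space (first edit) key is touched, and a red edge highlight when the delete (second edit) key is touched. The enum names, key identifiers, and the color value are assumptions made for the example, not values specified by the patent.

```kotlin
// Illustrative model of the feedback in step S230. The enum names, key
// identifiers, and the red color value are assumptions, not values given in
// the patent.
enum class BarFeedback { NONE, COMMIT_ANIMATION, DELETE_EDGE_HIGHLIGHT }

data class BarState(val word: String, val feedback: BarFeedback, val edgeColor: String?)

fun onEditKey(state: BarState, key: String): BarState = when (key) {
    // First edit key (space): the word moves toward the output region.
    "SPACE" -> BarState(word = "", feedback = BarFeedback.COMMIT_ANIMATION, edgeColor = null)
    // Second edit key (input cancel): last character erased, edges drawn in red.
    "DELETE" -> BarState(
        word = state.word.dropLast(1),
        feedback = BarFeedback.DELETE_EDGE_HIGHLIGHT,
        edgeColor = "#FF0000"
    )
    else -> state
}
```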
According to the present embodiment, since an input word is displayed in real time in the edit state display region 510, which is displayed adjacent to the input region 410 including the character keys, the user can easily recognize a word that is currently being entered, or its editing state, without moving the eyes to the output region 310. In addition, the controller 180 displays, as the first visual information 501', a state in which the first word entered by a touch input is output on the output region 310, and displays, as the second visual information 511, a state in which the first word 501 output on the edit state display region 510 is erased. Accordingly, the user can visually recognize a state in which a word is edited by a touch input. Thus, the user can recognize whether a word (and content) is input as intended with an applied touch, without having to move the eyes to the output region. At the same time, referring to Figs. 3B(a) to 3B(c), the touch screen 151 may output only the input region 410 and the output region 310 before a touch input is applied to the input region 410. In this case, the controller 180 can control the touch screen 151 to output the edit state display region 510 between the input region 410 and the output region 310 based on a touch input applied to the input region 410. In this case, the size of at least one of the input region 410 and the output region 310 can be adjusted. Based on a touch input applied to a character key 412 of the input region 410, the controller 180 can control the touch screen 151 to output a word corresponding to the touched character key 412 on the edit state display region 510 and simultaneously output the word on the output region 310. Namely, the touch screen 151 may output the word corresponding to the character key 412 on the output region 310 and the edit state display region 510 in real time. In this case, whether the word corresponding to the character key 412 is output to both the output region 310 and the edit state display region 510 in real time, or is preferentially output only on the edit state display region 510 as described above with reference to Figs. 3A(a) to (d), may be changed according to a user selection.
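The user-selectable choice between the two behaviors could be captured by something as small as the following sketch; the setting name EchoMode and its values are hypothetical, not taken from the patent.

```kotlin
// Hypothetical EchoMode setting: typed characters go to the edit-state bar
// only, or to the bar and the output region at the same time.
enum class EchoMode { BAR_ONLY, BAR_AND_OUTPUT }

class TypingEcho(var mode: EchoMode = EchoMode.BAR_ONLY) {
    val bar = StringBuilder()
    val output = StringBuilder()

    fun onCharacterKey(ch: Char) {
        bar.append(ch)                                           // always mirrored above the keyboard
        if (mode == EchoMode.BAR_AND_OUTPUT) output.append(ch)   // optionally echoed in real time
    }
}
```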
[0035] Figs. 4A to 4F(c) are conceptual views illustrating a control method of changing an input position in the edit state display region. When the word input function is executed and a user touch input is applied, the touch screen 151 may display, in a region, an indicator indicating the position at which a word is to be entered. The user can enter characters in order based on touch inputs and can add a new character between characters. Hereinafter, a control method of changing an input position when a word is output by applying a touch will be described according to various embodiments. Referring to Fig. 4A, the touch screen 151 outputs the edit state display region 510, including an input character, between the output region 310 and the input region 410. When the word is output on the edit state display region 510 and a touch input is applied to the edit state display region 510, the controller 180 controls the touch screen 151 to output a cursor 520 on a region corresponding to the touch input.
[0036] For example, when a touch input is applied between two characters, the controller 180 may display the cursor 520 between the characters. Also, when the cursor 520 is output, the controller 180 controls the touch screen 151 to output an additional character, corresponding to a touch input applied to the input region 410, on the region where the cursor 520 is positioned. Also, based on a continuous touch input applied to the edit state display region 510, the controller 180 can change the position of the cursor 520. The user can apply a touch input to the edit state display region 510 to designate a position between characters at which to enter an additional character. At the same time, when a touch input for forming the cursor 520 or moving the cursor 520 is applied to the edit state display region 510, the controller 180 controls the touch screen 151 to output third visual information 512. Here, the third visual information 512 may have a preset shape and color of the edit state display region 510, differentiated from the second visual information 511. Accordingly, when the third visual information 512 is output, the user can verify that an input position is designated and changed by a touch input. For example, the second visual information 511 can be implemented as a line image forming partial edges of the edit state display region 510 displayed in red, and the third visual information 512 can be implemented as a line image forming a partial edge of the edit state display region 510 in a different color. Namely, the user can recognize the current edit state when he sees the color of the edges of the edit state display region change. At the same time, when a word is output on the edit state display region 510 based on a touch input applied to a character key 412, the controller 180 may display a portion of the edges of the edit state display region 510 as a blue line. Namely, the user can easily recognize when a character is entered, when a character is erased, and when an input position is changed, based on the edge color of the edit state display region 510. At the same time, the controller 180 may output the first word 501, output by a touch input applied to the character key 412, also to the output region 310 in real time with the touch input. In this case, the controller 180 may output the cursor 520 together with the first word 501 output on the output region 310. Namely, the touch screen 151 may simultaneously display, in the output region 310, the output state displayed in the edit state display region 510. A control method of changing an input position on the edit state display region 510 by a touch input applied to the input region 410 will be described with reference to Fig. 4B. Based on a particular touch input applied to the input region 410, the controller 180 can change the input region 410 to a keypad region 410'. For example, when the keypad region 410' is enabled, the touch screen 151 may output the virtual keyboard as a semi-transparent image. Namely, when the input region 410 is changed to the keypad region 410', the controller 180 limits the reception of touch inputs applied to the plurality of keys forming the virtual keyboard. The particular touch input may be defined by a preset region and a preset type of touch. For example, the particular touch input may correspond to a long touch applied to the first edit key 411a. 
When the particular touch input is applied, the controller 180 changes the input region 410 to the keypad region 410', and controls the touch screen 151 to change the shape of the virtual keyboard 410. For example, the touch screen 151 may adjust a color or transparency of the virtual keyboard 410. The keypad region 410' receives a continuous touch input from the user, and based on the continuous touch input, the controller 180 can change an input position in the edit state display region 510. In general, the input position is located on the right side of the word, and the input position can be moved to the left based on a touch input that moves to the left. Similarly, when a touch input is applied to the keypad region 410', the controller 180 controls the touch screen 151 to output the third visual information 512 to the edit state display region 510. Here, the long touch applied to the first edit key 411a can be defined as a first touch, and the continuous touch input for changing the input position can be defined as a second touch. The first and second touches may form one continuous touch input. Namely, after the first touch is applied, when the touch is released from the touch screen 151, the controller 180 can change the keypad region 410' back to the input region 410. When the second touch is released, the controller 180 may change the keypad region 410' back to the input region 410 and enter an additional character at the input position designated by the second touch. A control method of outputting a modified word when an input word includes a typographical error will be described with reference to Fig. 4C. The controller 180 may enter a word at the designated input position. When the first word 501 is displayed in the edit state display region 510, the controller 180 determines whether the spelling of the first word 501 is correct. Namely, the controller 180 can determine whether the first word 501 exists in a dictionary. When the first word 501 is a typographical error, the controller 180 controls the touch screen 151 to change the first word into a modified first word 501' and output it. The modified first word 501' may include substantially the same characters as the first word 501, but in a different form. For example, the modified first word 501' can be implemented with a different font, font style, or color, or an animation may be applied to it. Thus, the user can easily recognize that the input word is a typographical error.
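The dictionary check behind the modified word 501' could be as small as the following sketch; the dictionary contents and the styling values are assumptions made for the example, not details given in the patent.

```kotlin
// Illustrative dictionary check: if the entered word is not found, a visually
// modified form (different color/style) is produced for the edit-state bar.
// The tiny dictionary and the styling values are assumptions.
data class DisplayWord(val text: String, val color: String, val italic: Boolean)

fun styleForSpelling(word: String, dictionary: Set<String>): DisplayWord =
    if (word.uppercase() in dictionary)
        DisplayWord(word, color = "#000000", italic = false)    // spelled correctly
    else
        DisplayWord(word, color = "#FF0000", italic = true)     // typographical error: modified form 501'

fun main() {
    val dictionary = setOf("GOOD", "MORNING", "SAY", "SOY")
    println(styleForSpelling("GOOODS", dictionary))   // flagged as a typo
}
```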
[0037] A control method of correcting a word in the edit state display region 510 will be described with reference to Figs. 4D(a) to 4D(d). The edit state display region 510 (Fig. 4D(a)) outputs the third visual information 512. When the third visual information 512 is output, the controller 180 can designate an input position for entering an additional word based on a user touch input. Also, the touch screen 151 may output a cursor indicating the input position together with the word 501 on the edit state display region 510. When the cursor is moved between characters, the controller 180 controls the touch screen 151 to output a write window 411' on the keypad region 410'. Also, in order to erase a character in the keypad region 410', the controller 180 controls the touch screen 151 to output an input cancel key 412' for receiving a touch input. For example, when the virtual keyboard 410 changes to the keypad region 410', the controller 180 restricts the input of the plurality of function keys and the plurality of character keys included in the virtual keyboard. However, when the keypad region changes to the write window, the controller 180 activates the input cancel key 412' to receive a user touch input.
[0038] At the same time, when a touch input is applied to the write window 411', the controller 180 controls the touch screen 151 to output an image based on the touch input. For example, the controller 180 analyzes the touch input applied to the write window 411', searches for a similar character, and controls the touch screen 151 to output a character matching the drawn image on the edit state display region 510. As a result, when at least one of the characters is entered incorrectly while entering characters, the user can simply add a character, without having to delete or re-enter all of the characters.
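A handwriting recognizer is well beyond the scope of a sketch, so the example below stubs it out behind an interface and only shows the insertion step: the recognized character is placed at the current cursor position. All names here are hypothetical.

```kotlin
// The handwriting recognizer is stubbed out behind an interface; only the
// insertion of the recognized character at the cursor position is shown.
fun interface StrokeRecognizer {
    fun recognize(strokePoints: List<Pair<Float, Float>>): Char
}

fun insertFromWriteWindow(
    word: String,
    cursor: Int,
    stroke: List<Pair<Float, Float>>,
    recognizer: StrokeRecognizer
): String {
    val ch = recognizer.recognize(stroke)
    return word.substring(0, cursor) + ch + word.substring(cursor)
}

fun main() {
    val stub = StrokeRecognizer { 'O' }               // stands in for real recognition
    println(insertFromWriteWindow("GOD", 2, emptyList(), stub))   // prints "GOOD"
}
```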
[0039] Referring to Figs. 4D(a) and (d), when the word output on the edit state display region 510 is misspelled, the controller 180 controls the touch screen 151 to output recommendation words 413a and 413b related to the word. The touch screen 151 outputs the first and second recommendation words 413a and 413b arranged on both sides of the first word 501. The recommendation words can include words having a meaning in the dictionary. Also, in this case, at least one of the input cancel key 412' and the write window 411' can be output together with the first and second recommendation words 413a and 413b. A control method of editing a word output on the output region 310 will be described with reference to Fig. 4E. Characters based on the input region 410 are sequentially output on the output region 310. A word 501 corresponding to the character keys of the input region 410 to which touch inputs have been applied is output to the edit state display region 510. At the same time, based on a particular touch input applied to the first edit key 411a, the controller 180 changes the input region 410 to the keypad region 410'. Based on a continuous touch input applied to the keypad region 410', the controller 180 can progressively move the cursor 520, displayed on the right side of the first word 501, to the left of the first word 501. When the continuous touch input is continuously applied, the controller 180 can sequentially output the previously entered words. Referring to Fig. 4E, when a second word 502 (SOY) is output on the output region 310 and the first word 501 is output on the edit state display region 510, and when touch inputs are continuously applied to the keypad region 410', the second word 502 is output to the edit state display region 510. Also, in this case, the controller 180 can control the touch screen 151 to output the third visual information 512. When the first word 501 is changed to the second word 502 in the edit state display region 510, the controller 180 can extract third and fourth recommendation words 531a and 531b with respect to the second word 502, and controls the touch screen 151 to output the third and fourth recommendation words 531a and 531b. Here, the first to fourth recommendation words 413a, 413b, 531a, and 531b can be output regardless of whether the word displayed in the edit state display region 510 is correctly or incorrectly spelled. That is, when a word is related to the word displayed in the edit state display region 510, the word can be selected as a recommendation word. Accordingly, the user can sequentially check the words output on the output region 310 by applying touch inputs to the input region 410, without having to apply touch inputs to the output region 310 or having to move the eyes. A control method of changing an input position on the output region 310 will be described with reference to Figs. 4F(a) to 4F(c). Referring to Fig. 4F(a), the touch screen 151 outputs words in a plurality of rows: "GOOD MORNING" is output in the first row, and "SAY" is output in the second row. Based on the touch input applied to the first edit key 411a, the controller 180 changes the input region 410 to the keypad region 410'. Based on a continuous touch input extending from the touch input applied to the first edit key 411a and moving up and down on the touch screen 151, the controller 180 may change the row of the input position. Here, the up/down direction corresponds to the length direction of the mobile terminal 100. 
Referring to Figs. 4F(a) and 4F(b), based on a touch input in a downward direction, the controller 180 may change the input position to a new row in the output region 310. The cursor 520 may be displayed in the new row. Referring to Figs. 4F(a) and 4F(c), based on a touch input applied in an upward direction, the controller 180 can change the input position to a point between "GOO" and "D" in the output region 310, and control the touch screen 151 to output the cursor 520.
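A bare-bones model of this row navigation might look like the following; the 60-pixel row step and 40-pixel column step are invented values, and the class name MultiRowCursor is hypothetical.

```kotlin
// Sketch of row/column navigation in pad mode; the 60 px per row and 40 px per
// column steps are invented values.
class MultiRowCursor(private val rows: List<String>) {
    var row = rows.lastIndex
        private set
    var column = rows.last().length
        private set

    fun onVerticalDrag(dyPixels: Float) {             // dy < 0 moves to an upper row
        val steps = (dyPixels / 60f).toInt()
        row = (row + steps).coerceIn(0, rows.lastIndex)
        column = column.coerceAtMost(rows[row].length)
    }

    fun onHorizontalDrag(dxPixels: Float) {
        column = (column + (dxPixels / 40f).toInt()).coerceIn(0, rows[row].length)
    }
}

fun main() {
    val cursor = MultiRowCursor(listOf("GOOD MORNING", "SAY"))
    cursor.onVerticalDrag(-60f)                       // move up to the first row
    println("${cursor.row}:${cursor.column}")         // prints "0:3"
}
```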
[0040] When an input position is designated between the characters of a word, the controller 180 may control the touch screen 151 to output the word to the edit state display region 510. In accordance with the present embodiment, when the user wants to enter a word or edit a word in the middle of a long sentence created in a plurality of rows, the user can designate an input position without having to apply a touch input to the output region. Figs. 5A and 5B are conceptual views illustrating a control method of erasing a word output on the edit state display region according to various embodiments of the present invention. Referring to Fig. 5A, when a touch input is applied to the second edit key 411b, corresponding to an input cancel key among the edit keys 411, the controller 180 controls the touch screen 151 to output the second visual information 511 on the edit state display region 510. Based on the touch input applied to the second edit key 411b, the controller 180 erases a portion of the word. When the touch input applied to the second edit key 411b is released, the controller 180 controls the touch screen 151 to limit the output of the second visual information 511. For example, when the second visual information 511 corresponds to a changed color of the edit state display region 510, the controller 180 may control the touch screen 151 to restore the color of the edit state display region 510. Namely, by means of the second visual information displayed in the edit state display region 510, the user can quickly check the edit state according to the edit key to which a touch input is applied, without having to move the eyes to the output region 310. A control method of erasing a portion of a word based on a touch input applied to the keypad region will be described with reference to Fig. 5B. When a third touch of a predetermined specific type is applied to the second edit key 411b, corresponding to an input cancel key among the edit keys 411, the controller 180 changes the input region 410 to the keypad region 410'. Here, the third touch of the predetermined type may correspond to a long touch input applied for a preset period of time (a few seconds). Also, the controller 180 controls the touch screen 151 to output the second visual information 511 to the edit state display region 510. At the same time, based on a fourth touch applied continuously with the third touch, the controller 180 selects a portion of the word. The fourth touch may correspond to a continuous touch input moving in a particular direction. For example, when the fourth touch moves in a leftward direction, at least a portion of the characters, taken sequentially from right to left, can be selected. The controller 180 may control the touch screen 151 to display the selected characters distinctly. For example, the touch screen 151 may highlight the selected characters in blocks or output the selected characters in a different font. When the third touch is released from the touch screen 151, the controller 180 controls the touch screen 151 to erase the selected characters. That is, the plurality of selected characters can be erased by the third and fourth touches. Likewise, when the third touch is released, the controller 180 changes the keypad region 410' back to the input region 410 and controls the touch screen 151 to limit the output of the second visual information 511. 
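The third-/fourth-touch deletion could be sketched as below. The selection-per-pixel ratio is an assumption, and the class name DragDeleteSelection is hypothetical; the point is only the arm-drag-release sequence described above.

```kotlin
// Arm-drag-release deletion: a long touch on the delete key arms a selection,
// a leftward drag grows it from the right end of the word, and releasing the
// touch erases the selection. The pixels-per-character value is an assumption.
class DragDeleteSelection(private var word: String) {
    private var selected = 0
    private var armed = false

    fun onLongPressDelete() { armed = true; selected = 0 }       // third touch

    fun onDragLeft(dxPixels: Float) {                            // fourth touch
        if (!armed) return
        selected = (dxPixels / 40f).toInt().coerceIn(0, word.length)
    }

    fun highlighted(): String = word.takeLast(selected)          // portion the UI would mark

    fun onRelease(): String {                                    // erase on release
        if (armed) word = word.dropLast(selected)
        armed = false
        selected = 0
        return word
    }
}
```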
According to the present embodiment, a plurality of characters may first be selected based on a touch input applied to the second edit key 411b and then erased at one time. Similarly, by outputting the second visual information based on an erase control command entered by the second edit key 411b, the touch screen 151 can indicate that the erase control command has been entered, and when a plurality of characters is selected, the touch screen 151 displays the plurality of characters to provide information about them visually. Since all of this is displayed in the edit state display region 510, the user can recognize the erase state and the characters selected to be erased in advance, without having to move the eyes to the output region 310. Figs. 6(a) to 6(d) are conceptual views illustrating a control method of controlling an output position of a word based on a touch input in the edit state display region. Based on a touch input applied to the input region 410, the controller 180 outputs a first word 501 to the edit state display region 510, as shown in Fig. 6(a). Referring to Figs. 6(b) and 6(c), based on a touch input applied to the first edit key 411a, the controller 180 outputs the first word 501 to the output region 310 and controls the touch screen 151 to terminate the output of the first word on the edit state display region 510. Referring to Figs. 6(b) and 6(d), based on a touch input applied to the first word 501, the controller 180 may control the position of the first word in the output region 310. In more detail, based on a fifth touch applied to the first word 501 for a predetermined period of time and a sixth touch applied continuously with the fifth touch, the controller 180 determines a region of the output region 310. The fifth touch may correspond to a long touch input applied to the region in which the first word 501 is output, and the sixth touch may correspond to a continuous sliding touch input extending from the fifth touch and moving to the output region 310. When the sixth touch is released, the controller 180 controls the touch screen to output the first word 501 to a region of the output region 310. When a plurality of characters has already been input to the output region 310, the controller 180 may control the touch screen 151 to output the first word 501 between the plurality of characters based on the sixth touch. When the fifth touch is applied, the controller 180 can control the touch screen 151 to output fourth visual information 513 on the first word 501. For example, the fourth visual information 513 can correspond to the first word 501 having a changed form, or may be implemented as an edge image formed along the outer circumference of the first word 501, or the like.
[0041] At the same time, the touch screen 151 may output the first word 501 while moving it to a region corresponding to the sixth touch. Also, when the sixth touch is applied between a plurality of characters already output, the positions of the plurality of characters may be temporarily changed. Accordingly, before entering a character, the user does not need to designate an input position on the output region 310, and thus the user can easily edit the characters to be output. Figs. 7A to 7E(d) are conceptual views illustrating a control method of entering a word on an input region. Referring to Fig. 7A, based on a touch input applied to the character key 412, the controller 180 controls the touch screen 151 to output the first word 501 to the edit state display region 510. The controller 180 outputs fifth and sixth recommendation words 532a and 532b related to the first word 501 on the edit state display region 510.
[0042] For example, the fifth and sixth recommendation words 532a and 532b may be arranged on both sides of the first word 501, and the fonts of the fifth and sixth recommendation words 532a and 532b may be implemented to be different from the font of the first word 501. The sizes of the fifth and sixth recommendation words 532a and 532b may be smaller than that of the first word 501. The fifth and sixth recommendation words 532a and 532b may be output when the first word 501 is recognized as a typographical error, but the present invention is not limited thereto. Based on a touch input applied to the first edit key 411a among the edit keys 411, the controller 180 may control the touch screen 151 to output the first word 501 on the output region 310. The controller 180 also controls the touch screen 151 to output the first visual information 501'. Namely, the first visual information 501' can be formed as an animation image moving from the edit state display region 510 to the output region 310. At the same time, based on the touch input applied to the first edit key 411a, the controller 180 controls the touch screen 151 to limit the output of the fifth and sixth recommendation words 532a and 532b. That is, when the output on the output region 310 is completed, the characters that had been output on the edit state display region 510 disappear. A control method of outputting a recommendation word on the output region will be described with reference to Figs. 7B(a) to 7B(c). Referring to Fig. 7B(a), when the first word 501 is output to the edit state display region 510, the controller 180 outputs the fifth and sixth recommendation words 532a and 532b related to the first word 501 on the edit state display region 510. The fifth and sixth recommendation words 532a and 532b are displayed on both sides of the first word 501. The input region 410 can be divided into first and second regions 410a and 410b for receiving a continuous touch input in a predetermined direction. The first and second regions 410a and 410b are formed in regions adjacent to the fifth and sixth recommendation words 532a and 532b, respectively. Based on a touch input applied to the first region 410a or the second region 410b, the controller 180 controls the touch screen 151 to output the recommendation word corresponding to the first region 410a or the second region 410b on the output region 310. For example, based on a touch input first applied to the second region 410b, passing through the sixth recommendation word 532b, and then applied to a region of the output region 310, the controller 180 may display the sixth recommendation word 532b on the output region 310. The touch input may correspond to a dragging type touch input or a flicking type touch input. However, the touch input need not be applied repeatedly to the first and second regions 410a and 410b. At the same time, when the touch input is applied and a recommendation word to be output on the output region 310 is selected, the controller 180 controls the touch screen 151 to output fifth visual information 514 in the edit state display region 510. The fifth visual information 514 may be implemented in a form in which at least one region of the edit state display region 510 is changed. That is, based on the fifth visual information 514, the user can recognize that a word in the edit state display region 510 is output on the output region 310. 
For example, when the fifth visual information 514 is displayed in the region in which the selected recommendation word is displayed in the edit state display region 510, the user can easily recognize the word selected to be output to the output region 310. Likewise, before the recommendation word is output on the output region 310, an animation image in which the recommendation word moves from the edit state display region 510 to the output region 310 can be displayed.
[0043] So far, the touch input passing through the sixth recommendation word 532b has been described, but the present invention is not limited thereto. For example, when a continuous touch input (a flicking type input) applied to the first region 410a or the second region 410b is detected, the controller 180 may output the fifth recommendation word 532a or the sixth recommendation word 532b on the output region 310. Here, however, the touch input applied to the first region 410a or the second region 410b may be applied in an upward direction on the touch screen 151, namely in a direction approaching the output region 310. A method of outputting a recommendation word based on a touch input applied to the first region 410a or the second region 410b will be described with reference to Fig. 7C. The touch screen 151 according to the present embodiment outputs the first word 501, based on a touch input applied to the character key 412, on the edit state display region 510 and the output region 310 substantially at the same time.
[0044] The controller 180 outputs the fifth and sixth recommendation words 532a and 532b with respect to the first word 501 on the edit state display region 510, and divides the input region 410 into the first and second regions 410a and 410b. Based on a continuous touch input applied to the second region 410b, the controller 180 erases the first word 501 in the output region 310, and controls the touch screen 151 to output the sixth recommendation word 532b on the region in which the first word 501 was output. Namely, the user can conveniently replace the entered first word with the recommendation word.
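One way to express the split-region selection in code is sketched below; the screen-half test and the flick threshold are simplifications invented for the example, not the patent's geometry.

```kotlin
// Split-region commit: a flick toward the output region that starts in the
// left half selects the left-hand recommendation word, the right half the
// right-hand one. Coordinates and the flick threshold are illustrative.
data class Flick(val startX: Float, val dy: Float)   // dy < 0 means "toward the output region"

fun wordToCommit(
    flick: Flick,
    screenWidth: Float,
    leftRecommendation: String,
    rightRecommendation: String,
    pendingWord: String
): String = when {
    flick.dy > -50f -> pendingWord                          // not a flick toward the output
    flick.startX < screenWidth / 2 -> leftRecommendation    // first region 410a
    else -> rightRecommendation                             // second region 410b
}
```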
[0045] At the same time, the controller 180 can determine whether the first word 501 includes a typographical error. When it is determined that the first word 501 includes a typographical error, the controller 180 controls the touch screen 151 to change the display of the first word 501 in the edit state display region 510. For example, the touch screen 151 may shade the first word 501 with a preset color or may change the font of the first word 501 and output it. Also, the controller 180 may control the touch screen 151 to change the first word 501 in the output region 310 so that it is substantially identical to the first word 501 in the edit state display region 510, and output it. Even though a different character is output on the edit state display region 510 based on a touch input applied to the character key 412, the controller 180 may control the touch screen 151 to output the first word 501 in a modified form in the output region 310. Accordingly, when a recommendation word with respect to the first word 501 is output, the user can check whether the first word 501 has a typographical error, and even when a different word has been output on the edit state display region 510, a typographical error can easily be recognized in the output region 310.
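The patent does not say how the dictionary check or the recommendation candidates are computed; the following sketch shows one plausible approach (a small dictionary ranked by Levenshtein distance), purely for illustration.

```kotlin
// Illustrative spell check and recommendation ranking: a small dictionary is
// ordered by Levenshtein distance to the entered word and the two closest
// entries are offered. This ranking is an assumption, not the patent's method.
fun editDistance(a: String, b: String): Int {
    val d = Array(a.length + 1) { IntArray(b.length + 1) }
    for (i in 0..a.length) d[i][0] = i
    for (j in 0..b.length) d[0][j] = j
    for (i in 1..a.length) for (j in 1..b.length) {
        val cost = if (a[i - 1] == b[j - 1]) 0 else 1
        d[i][j] = minOf(d[i - 1][j] + 1, d[i][j - 1] + 1, d[i - 1][j - 1] + cost)
    }
    return d[a.length][b.length]
}

fun recommend(word: String, dictionary: List<String>, count: Int = 2): List<String> =
    dictionary.sortedBy { editDistance(it, word.uppercase()) }.take(count)

fun main() {
    val dict = listOf("SOY", "SAY", "SO", "SAID", "GOOD", "MORNING")
    println(recommend("SOI", dict))   // e.g. [SOY, SO]
}
```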
[0046] Also, in the present embodiment, a continuous touch input applied to the first region 410a or the second region 410b may pass through the region in which the fifth recommendation word 532a or the sixth recommendation word 532b is displayed, and may not be applied to the edit state display region 510.
[0047] A control method of providing a recommendation word will be described with reference to Fig. 7D. Based on a touch input applied to the input region 410, the controller 180 outputs a word corresponding to the character key to which the touch input is applied, to the edit state display region 510.
[0048] After a word is entered on the edit state display region 510, when a touch input is not applied to the input region 410 for a preset period of time t, the controller 180 controls the touch screen 151 to output a recommendation word based on the word. Namely, when the preset period of time has elapsed, the controller 180 can search for and extract a recommendation word related to the word. The touch screen 151 outputs the extracted recommendation word on the edit state display region 510. However, when a touch input is applied to the input region 410 and a word is being entered, the controller 180 controls the touch screen 151 to limit the output of the recommendation word. Referring to Figs. 7E(b) and 7E(c), when a third word 503 and recommendation words related to the third word 503 are output on the edit state display region 510, the controller 180 controls the touch screen 151 to output a recommendation word on the output region 310 based on a touch input applied continuously from a portion of the input region 410 to the edit state display region 510. Referring to Figs. 7E(b) and 7E(d), based on a touch input applied to the first edit key 411a among the edit keys 411 of the input region 410, the controller 180 controls the touch screen 151 to output a character corresponding to a character key that has received a touch input in the input region 410, to the output region 310. Referring to Fig. 7E(b), based on a touch input applied so as to output the word displayed in the edit state display region 510 on the output region 310, the controller 180 controls the touch screen 151 to output the fifth visual information 514 on at least one region of the edit state display region 510. Namely, when a plurality of words is output on the edit state display region 510, the user can select a word to be output on the output region 310 based on various types of touch inputs.
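The idle-timeout rule described at the start of this passage (no touch for the preset period t) could be reduced to a timestamp comparison, as in the sketch below; the one-second default for t is an assumption.

```kotlin
// Recommendations appear only after the input region has been idle for a
// preset period t; any key touch resets the timer. The one-second default is
// an assumption.
class IdleRecommendationTrigger(private val idleMillis: Long = 1000L) {
    private var lastTouch = System.currentTimeMillis()

    fun onKeyTouched() {                               // also hides current recommendations
        lastTouch = System.currentTimeMillis()
    }

    fun shouldShowRecommendations(now: Long = System.currentTimeMillis()): Boolean =
        now - lastTouch >= idleMillis
}
```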
[0049] Fig. 8 is a conceptual view illustrating a control method of distinctly displaying, in the edit state display region, a word based on a touch input applied to the input region. Based on a touch input applied to a character key of the input region 410, the controller 180 outputs a word corresponding to the character key in real time.
[0050] When a touch input is applied to a character key, the word corresponding to the character key is output in a preset font. Here, the font may include a size, a thickness, an underline, a color, an italic style, a shadow effect, and the like. The preset font is changed to a reference font, such as that of a previously output word, by an additional touch input applied to the input region. Alternatively, the controller 180 controls the touch screen 151 to change the word to the reference font when a predetermined period of time has elapsed. For example, a previously entered word may be black, while the most recently entered word may have an increased size and be highlighted in blue. As a result, the user can easily recognize the word output by the most recently applied touch input. Figs. 9A to 9D(d) are conceptual views illustrating screen information output to the input region. Referring to Fig. 9A, when an application is executed, the touch screen 151 outputs a first run screen 301 and the input region 410, and outputs a word on the edit state display region 510 based on a touch input applied to the input region 410. The first run screen 301 is displayed in the output region 310. The first run screen 301 includes a first output window 321 for outputting a word based on a touch input applied to the input region 410. For example, when a message application is executed, a message to be sent, entered by the user, is output to the first output window 321. Also, a first image 322 to be sent together with the message may be displayed in the first output window 321. Referring to Fig. 9A, based on a touch input applied to the first image 322, the controller 180 outputs the image to the output region 310. The controller 180 controls the touch screen 151 to magnify the first image 322 and output the magnified first image 322 according to the size of the output region 310. At the same time, the controller 180 controls the touch screen 151 to output the first run screen 301 to the input region 410. The first run screen 301 is output so as to overlap the virtual keyboard. The touch screen 151 outputs the first run screen 301 and the virtual keyboard in a semi-transparent state. At the same time, when the first run screen 301 is output to the input region 410, the controller 180 controls the touch screen 151 to output a first graphic image 540 for receiving a touch input for changing the output position of the first run screen 301. Referring to Fig. 9A, when a touch input is applied to the graphic image 540, the controller 180 again outputs the first run screen 301 on the output region 310. In this case, the controller 180 may control the touch screen 151 to output the first image 322 so that it overlaps the virtual keyboard. Based on a touch input applied to the input region 410, the controller 180 controls the touch screen 151 to enter a word on the first output window 321. Accordingly, the user can recognize an image to be sent, and output a run screen and an image to a desired region, whereby the user can recognize desired information while the virtual keyboard for entering a word is output on the limited space of the touch screen 151. A control method of outputting a recommendation word on the input region will be described with reference to Figs. 9B(a) and 9B(b). Referring to Fig. 9B(a), based on a touch input applied to the input region 410, the controller 180 controls the touch screen 151 to output a fourth word 504 to the edit state display region 510. 
Also, based on a touch input applied to the first edit key 411a among the edit keys 411, the controller 180 controls the touch screen 151 to output the fourth word 504 on the first output window 320 of the first run screen 301. The controller 180 extracts a first recommendation word 302 related to the fourth word 504 output on the first output window 320, and controls the touch screen 151 to output the extracted first recommendation word 302 on the input region 410. The virtual keyboard on the input region 410 and the first recommendation word 302 can be output so as to overlap each other in a semi-transparent form. The first recommendation word 302 may be a word to be substituted for the fourth word 504 when the fourth word 504 is misspelled, or may be a phrase to replace the fourth word 504 according to a result of analyzing the sentence output on the first output window 320.
[0051] Referring to Fig. 9B(b), the controller 180 controls the touch screen 151 to output the fourth word 504 to the edit state display region 510 based on a touch input applied to the input region 410. Similarly, the controller 180 controls the touch screen 151 to output seventh and eighth recommendation words 535a and 535b related to the fourth word 504. The seventh and eighth recommendation words 535a and 535b can be output on both sides of the fourth word 504 in the edit state display region 510. Also, the controller 180 controls the touch screen 151 to output a second recommendation word 303 related to the fourth word 504 on the input region 410. In this case, when a touch input is applied to the first edit key 411a, the fourth word 504 is output to the output region 310. At the same time, when a touch input of a type different from the touch input applied to the first edit key 411a is applied to the second recommendation word 303, the controller 180 controls the touch screen 151 to output the second recommendation word 303, to which the touch input was applied, to the output region 310. A control method of selectively outputting an input word and a recommendation word on the output region will be described with reference to Figs. 9C(a) to 9C(c). Referring to Fig. 9C(a), when a plurality of words is output on the first output window 320, the controller 180 controls the touch screen 151 to output the misspelled words (PHONECALL, MO) on the edit state display region 510. When a touch input is applied to a word output on the edit state display region 510, the controller 180 controls the touch screen 151 to output a second recommendation word 302 related to the selected word on the input region 410. The second recommendation word 302 and the virtual keyboard can be output so as to overlap each other in a semi-transparent state in the input region 410.
[0052] When the second recommendation word 302 and the virtual keyboard are output together, the controller 180 can edit the word output on the first output window 320 or output a word on the first output window 320 based on a first touch input applied to the input region.
[0053] However, when a second touch input, differentiated from the first touch input, is applied to the second recommendation word 302, the controller 180 controls the touch screen 151 to output the second recommendation word 302 to the first output window 320. Here, the second touch input can be a long touch input applied for a preset period of time. As a result, when an erroneous word has been entered on the output region, the user is given an opportunity to correct the word. Thus, the user can easily correct a word that has already been output, without having to check the output region again.
[0054] A control method of outputting a second run screen for adding content will be described with reference to Figs. 9D(a) to 9D(d). Referring to Fig. 9D(a), the first run screen 301 includes an icon for adding content to be sent together with the message. The controller 180 executes a preset application based on a touch input applied to the icon. The application may be a media application including content, or the like. The controller 180 controls the touch screen 151 to output a second run screen 304 of the application to the input region 410. The second run screen 304 and the virtual keyboard may overlap each other in a semi-transparent state. Based on a touch input applied to the input region 410, the controller 180 controls the touch screen 151 to output a word on the first output window 320. Also, when a second touch input is applied to the input region 410, the controller 180 may select at least one content item of the second run screen 304. The first and second touch inputs are differentiated, and the second touch input may be a long touch input. The controller 180 controls the touch screen 151 to output an image corresponding to the selected content on the first output window 320. In the present embodiment, the user does not need to leave the current screen and separately run the application in order to add content. Figs. 10A to 10C(b) are conceptual views illustrating a control method using screen information output to the input region. Referring to Fig. 10A, the touch screen 151 outputs the edit state display region 510 between a third run screen 305 and the input region 410. For example, the third run screen 305 may be a web page including a second output window 330. The second output window 330 may be a search window in which a search word is output.
[0055] Based on a touch input applied to the input region 410, the controller 180 controls the touch screen 151 to output a word, and a recommendation word related to the word, on the edit state display region 510. When the word (or the related recommendation word) is displayed on the second output window 330, the controller 180 recognizes the word as a search word and controls the touch screen 151 to output at least one recommendation search word related to the word on the input region 410. The recommendation search word can be output in a semi-transparent state together with the virtual keyboard. When a second touch input is applied to the recommendation search word, the controller 180 controls the touch screen 151 to output a search result screen 305' using the recommendation search word as the search word. Namely, the recommendation search word is output so that it does not cover the third run screen 305.
[0056] Referring to Fig. 10B, when a word is entered on the second output window 330, the controller 180 controls the touch screen 151 to output a second graphical image 541. Based on a touch input applied at the second graphic image 541, the controller 180 can adjust the transparency of the input region 410.
[0057] Also, based on a touch input applied to the second graphic image 541, the controller 180 adjusts the virtual keyboard to become progressively transparent, and controls the touch screen 151 to output a function icon 306. For example, the function icon 306 may correspond to a control command for storing an image output on the web page in the memory 170, storing an address of the web page, or outputting a search result on a new page.
[0058] For example, based on a second touch input applied to the function icon 306, the controller 180 may perform a corresponding function. The second touch input can be a long touch input differentiated from the first touch input.
[0059] A control method of outputting sub-information of content added to the output region will be described with reference to Figs. 10C(a) and 10C(b). Referring to Fig. 10C(a), the touch screen 151 outputs the second run screen 304 to the output region 310 and outputs a fourth run screen 307 together with the virtual keyboard to the input region 410. Namely, the user is provided with the virtual keyboard and the fourth run screen 307 in the input region 410. For example, the fourth run screen 307 may be an execution screen of a memo application including characters. In this case, the user can enter a word on the web page while viewing previously stored words. Namely, the user does not need to copy a word or run an application separately. Referring to Fig. 10C(b), the fourth run screen 307 is output together with the keyboard on the input region, and the fourth run screen 307 includes an image 307a of an attached file. For example, the fourth run screen 307 may be an e-mail creation screen. Also, the controller 180 controls the touch screen 151 to output the sub-information 307a of the attached file. Figs. 11A(a) to 11B(b) are conceptual views illustrating a control method of changing the size of the virtual keyboard according to a user's input state, which will now be described. Referring to Fig. 11A(a), the touch screen 151 outputs the input region 410, the output region 310, and the edit state display region 510. Referring to Fig. 11A(b), the touch screen 151 outputs first and second screen information 310a and 310b of different applications. Based on a touch input applied to either of the first and second screen information 310a and 310b, the controller 180 outputs the input region 410 together with the first and second screen information 310a and 310b. Also, the controller 180 controls the touch screen 151 to output the edit state display region 510 so that it is adjacent to the input region 410.
[0060] In this case, the size of the input region of Fig. 11A(b) is smaller than that of the input region of Fig. 11A(a). That is, the length of the input region 410 shown in Fig. 11A(b), measured in the length direction of the mobile terminal, is reduced. Therefore, the user can apply a touch input to the virtual keyboard while simultaneously checking the first and second screen information 310a and 310b. A control method of adjusting the size of the input region by detecting the region touched by the user will be described with reference to Figs. 11B(a) and 11B(b). The controller 180 detects the region of a touch input applied to each of the keys of the virtual keyboard. For example, when touch inputs applied by the user to a lower end portion of each key are frequently detected, the controller 180 may control the touch screen 151 to reduce the length of the input region 410. Namely, by detecting the applied touch inputs, the controller 180 may enlarge or reduce the input region 410 to allow the user to easily enter characters, or the like. Figs. 12A and 12B are conceptual views illustrating a control method of outputting notification information based on a touch input applied to the input region, which will now be described. A control method of outputting vibrations of differentiated patterns corresponding to a character key and an edit key, based on a touch input applied to the input region 410, will be described with reference to Fig. 12A. Namely, the controller 180 can match different vibration patterns with the plurality of keys forming the virtual keyboard and control the haptic module 153 to output the vibration pattern corresponding to the key to which a touch input is applied. Thus, the user can recognize the touched key and the input character without having to look closely at the touch screen 151. A control method of outputting the character keys forming the virtual keyboard in different forms by using history information of the touch inputs applied by the user will be described with reference to Fig. 12B. The controller 180 may control the memory 170 to store the number of touch inputs applied to each character key. Thus, the controller 180 can output each key of the virtual keyboard in a form corresponding to the number of touch inputs. Here, the different forms may include a color, a size, an image effect such as flickering, and the like. As a result, the user can easily find a frequently used key. Figs. 13(a) and 13(b) are conceptual views illustrating a control method of analyzing information included in the output region and outputting a recommendation word. The analyzed information may include the type of the executed application, the type of a content-based document, the format of the text (words or characters) already output on the output region 310, information about a target to receive the characters, and the like. For example, when a screen for executing a message application is output on the output region 310, the name of a recipient to whom the message is to be sent can be included in the execution screen. When a word is input, the controller 180 may control the touch screen 151 to output a recommendation word using recipient information on the edit state display region 510.
[0061] For example, when a word such as "got it" is entered, whether to output an honorific expression can be determined based on the information about the recipient. Accordingly, the user can quickly enter a desired word by using the recommendation word output on the edit state display region 510. According to embodiments of the present invention, since the edit state display region, which is positioned adjacent to the input region and outputs a word based on a touch input applied to the input region, is provided, the user can immediately check an output word based on the touch input without moving the eyes to the output region.
[0062] Also, since the display unit outputs different visual information according to the edited state of the word, the edit state resulting from the user's touch input can be checked. In addition, since a plurality of content items for entering the word is output so as to overlap the virtual keyboard, there is no need to change the current page to check additional screen information. The mobile terminal according to the embodiments of the present invention is not limited in the application of its configurations and methods; rather, all or part of the embodiments can be selectively combined to form various modifications. The foregoing embodiments and advantages are merely illustrative and should not be construed as limiting the present invention. The present teachings can easily be applied to other types of apparatus. This description is intended to be illustrative, not to limit the scope of the claims. Many variations, modifications, and alternatives will occur to those skilled in the art. The features, structures, methods, or other characteristics of the embodiments described herein may be combined in various ways to provide additional and/or alternative embodiments. Since the present features may be embodied in various forms without departing from their characteristics, it should also be understood that the embodiments described above are not limited to any of the details of the foregoing description, unless otherwise specified, but should rather be construed broadly within the scope defined in the appended claims; therefore, all changes and modifications that fall within the metes and bounds of the claims, or equivalents of such metes and bounds, are intended to be embraced by the appended claims. Of course, the invention is not limited to the embodiments described and shown above, from which other modes and other embodiments can be provided without departing from the scope of the invention.
Claims (15)
[0001]
1. A mobile terminal (100) comprising: a wireless communication unit (110) configured to provide wireless communication; a touch screen; and a controller (180) configured to display on the touch screen an input region (410), including a plurality of character keys and a plurality of edit keys (411a, 411b, 421a, 421b), and an output region (310), display on the touch screen an edit state display region (510), between the input region (410) and the output region (310), for displaying a word corresponding to touched character keys, select or edit the word displayed in the edit state display region (510) based on a touch input applied to the input region (410), and display the selected or edited word on the output region (310).
[0002]
The mobile terminal (100) of claim 1, wherein the plurality of edit keys (411a, 411b, 421a, 421b) include a first edit key (411a), and wherein in response to receiving a first touch of the first edit key (411a), the controller (180) is further configured to display the word in the output region (310).
[0003]
The mobile terminal (100) according to claim 2, wherein the controller (180) is further configured to: activate the input region (410) in a keypad mode based on a second touch applied to the first edit key (411a), wherein the second touch is different from the first touch, and change an input position for entering the word in the edit state display region (510) based on a touch input applied to the input region (410).
[0004]
The mobile terminal (100) according to any one of claims 1 to 3, wherein when a plurality of rows of words are displayed in the output region (310), the controller (180) is further configured to select a row of a word in which the input position is to be designated based on the second touch.
[0005]
The mobile terminal (100) of claim 3, wherein the plurality of edit keys (411a, 411b, 421a, 421b) further include a second edit key for deleting the word in the edit state display region (510), and wherein the controller (180) is further configured to select a portion of a plurality of characters of the word based on the second touch applied to the second edit key, and erase the selected portion of the plurality of characters in response to a release of the second touch applied to the second edit key.
[0006]
The mobile terminal (100) according to any of claims 1 to 5, wherein the controller (180) is further configured to display a recommendation word including a portion of the word in the edit state display region (510).
[0007]
The mobile terminal (100) of claim 6, wherein when the word includes a typographical error, the controller (180) is further configured to display the recommendation word.
[0008]
The mobile terminal (100) according to claim 6, wherein, when first and second words are displayed on both sides of the word in the edit state display region (510), the controller (180) is further configured to divide the input region (410) into first and second regions corresponding to the first and second words, and wherein, based on a continuous touch applied to the first region or the second region, the controller (180) is further configured to display the first word or the second word in the output region (310).
[0009]
The mobile terminal (100) of claim 8, wherein the controller (180) is further configured to: display the word in the output region (310) and in the edit state display region (510) based on a touch input applied to a corresponding character key, and wherein, based on a touch applied to the first region or the second region, the controller (180) is further configured to replace the word in the output region (310) with the first word or the second word.
[0010]
The mobile terminal (100) according to any one of claims 1 to 5, wherein the controller (180) is further configured to display various visual information on the edit state display region (510) based on a function of an affected edit key.
[0011]
The mobile terminal (100) of claim 10, wherein the visual information corresponds to a first image in which the word is moved to and displayed in the output region (310) when the word is displayed in the output region (310), wherein the visual information corresponds to a second image forming edges of the edit state display region (510) and having a first color when the word is erased, and wherein the visual information corresponds to a third image forming the edges of the edit state display region (510) and having a second color when an input position for entering the word is changed in the edit state display region (510).
[0012]
The mobile terminal (100) of claim 11, wherein, when the word is displayed in the edit state display region (510) based on a touch input applied to the character key, the controller (180) is further configured to display a newly entered word with a font different from a font of a previously entered word.
[0013]
The mobile terminal (100) according to claim 1, wherein the output region (310) corresponds to an execution screen of an application including a plurality of content items, and wherein the controller (180) is further configured to: display at least a portion of the plurality of content items on the input region (410) based on a touch input applied to the execution screen, and adjust a transparency of the plurality of character keys and the plurality of edit keys (411a, 411b, 421a, 421b).
[0014]
The mobile terminal (100) of claim 13, wherein the controller (180) is further configured to display a content-related graphic image on the input region (410) to control the application.
[0015]
A method of controlling a mobile terminal (100), the method comprising: displaying, via a touch screen of the mobile terminal, an input region (410) including a plurality of character keys and a plurality of edit keys (411a, 411b, 421a, 421b), and an output region (310); displaying on the touch screen an edit state display region (510) between the input region (410) and the output region (310) for displaying a word corresponding to affected character keys; selecting or editing, via a controller (180) of the mobile terminal, the word displayed in the edit state display region (510) based on a touch input applied to the input region (410); and displaying the selected or edited word on the output region (310).
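Claims 6 to 9 describe recommendation words displayed on either side of the composed word, with the input region split into two halves so that a touch on one half substitutes the corresponding recommendation. A minimal sketch of that selection logic follows; the dictionary, the Levenshtein-distance check and all identifiers are assumptions made for the example, not anything the claims prescribe.

```kotlin
// Sketch of the recommendation-word selection of claims 6 to 9: while a word is being
// composed, up to two recommendation words are shown on either side of it, and a touch
// on the corresponding half of the input region replaces the composed word.
fun editDistance(a: String, b: String): Int {           // classic Levenshtein distance
    val dp = Array(a.length + 1) { IntArray(b.length + 1) }
    for (i in 0..a.length) dp[i][0] = i
    for (j in 0..b.length) dp[0][j] = j
    for (i in 1..a.length) for (j in 1..b.length) {
        val cost = if (a[i - 1] == b[j - 1]) 0 else 1
        dp[i][j] = minOf(dp[i - 1][j] + 1, dp[i][j - 1] + 1, dp[i - 1][j - 1] + cost)
    }
    return dp[a.length][b.length]
}

fun recommendationsFor(word: String, dictionary: List<String>): List<String> =
    dictionary
        .filter { it != word && (it.startsWith(word) || editDistance(it, word) <= 2) }
        .take(2)                                         // at most a first and a second word

enum class TouchRegion { FIRST, SECOND }                 // the two halves of the input region

fun applyTouch(word: String, recommendations: List<String>, touch: TouchRegion): String =
    when (touch) {
        TouchRegion.FIRST -> recommendations.getOrElse(0) { word }
        TouchRegion.SECOND -> recommendations.getOrElse(1) { word }
    }

fun main() {
    val dictionary = listOf("keyboard", "keypad", "keystone")
    val composed = "keyboad"                             // word containing a typographical error
    val recs = recommendationsFor(composed, dictionary)  // ["keyboard", "keypad"]
    println("edit state region: ${recs.getOrNull(0)}  [$composed]  ${recs.getOrNull(1)}")
    println("after a touch on the first region: " + applyTouch(composed, recs, TouchRegion.FIRST))
}
```

A touch on the first half of the input region thus commits "keyboard" in place of the misspelled word, mirroring the replacement behaviour of claim 9; any real implementation would of course use the platform's own dictionary and suggestion engine.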
Similar documents:
Publication number | Publication date | Patent title
FR3021136A1|2015-11-20|MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME
FR3021133B1|2019-08-30|MOBILE TERMINAL AND METHOD FOR CONTROLLING THE MOBILE TERMINAL
FR3031601B1|2019-08-30|MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME
FR3021424B1|2019-09-20|MOBILE TERMINAL AND METHOD FOR CONTROLLING THE MOBILE TERMINAL
FR3025328B1|2019-07-12|MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME
FR3024786A1|2016-02-12|MOBILE TERMINAL AND METHOD FOR CONTROLLING THE SAME
EP2988201A1|2016-02-24|Mobile terminal and method of controlling the same
FR3026201A1|2016-03-25|
FR3021766A1|2015-12-04|MOBILE TERMINAL AND METHOD FOR CONTROLLING THE MOBILE TERMINAL
FR3022649A1|2015-12-25|
CN105808137B|2020-10-27|Mobile terminal and control method thereof
FR3041785A1|2017-03-31|MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME
FR3022367A1|2015-12-18|
US20170025122A1|2017-01-26|Mobile terminal and controlling method thereof
US9841891B2|2017-12-12|Mobile terminal and method of controlling the same
FR3019665A1|2015-10-09|
FR3039673A1|2017-02-03|MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME
FR3021425A1|2015-11-27|
FR3039674A1|2017-02-03|MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME
FR3043478A1|2017-05-12|MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME
FR3021135A1|2015-11-20|
US20160054567A1|2016-02-25|Mobile terminal, glasses-type terminal, and mutual interworking method using screens thereof
FR3046470B1|2019-11-08|MOBILE TERMINAL
FR3022648A1|2015-12-25|
FR3042084B1|2019-11-08|MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME
Patent family:
Publication number | Publication date
US20150331605A1|2015-11-19|
EP2945048B1|2018-04-11|
CN105100389B|2019-06-07|
CN105100389A|2015-11-25|
EP2945048A2|2015-11-18|
KR102177607B1|2020-11-11|
KR20150131838A|2015-11-25|
FR3021136B1|2018-09-07|
EP2945048A3|2016-03-30|
US10466897B2|2019-11-05|
Cited references:
Publication number | Application date | Publication date | Applicant | Patent title
WO2007037808A1|2005-09-16|2007-04-05|Apple Inc.|Virtual input device placement on a touch screen user interface|
US20140109016A1|2012-10-16|2014-04-17|Yu Ouyang|Gesture-based cursor control|
US6396520B1|2000-01-05|2002-05-28|Apple Computer, Inc.|Method of transition between window states|
US20090040184A9|2001-10-04|2009-02-12|Infogation Corporation|Information entry mechanism|
US7536650B1|2003-02-25|2009-05-19|Robertson George G|System and method that facilitates computer desktop use via scaling of displayed objects with shifts to the periphery|
US7358962B2|2004-06-15|2008-04-15|Microsoft Corporation|Manipulating association of data with a physical object|
US7515135B2|2004-06-15|2009-04-07|Research In Motion Limited|Virtual keypad for touchscreen display|
GB0505941D0|2005-03-23|2005-04-27|Patel Sanjay|Human-to-mobile interfaces|
US8074172B2|2007-01-05|2011-12-06|Apple Inc.|Method, system, and graphical user interface for providing word recommendations|
US8059101B2|2007-06-22|2011-11-15|Apple Inc.|Swipe gestures for touch screen keyboards|
WO2010035574A1|2008-09-29|2010-04-01|シャープ株式会社|Input device, input method, program, and recording medium|
US8584031B2|2008-11-19|2013-11-12|Apple Inc.|Portable touch screen device, method, and graphical user interface for using emoji characters|
CN102405456A|2009-02-04|2012-04-04|无钥启动系统公司|Data entry system|
WO2010089740A1|2009-02-04|2010-08-12|Benjamin Firooz Ghassabian|Data entry system|
US8739055B2|2009-05-07|2014-05-27|Microsoft Corporation|Correction of typographical errors on touch displays|
EP2545426A4|2010-03-12|2017-05-17|Nuance Communications, Inc.|Multimodal text input system, such as for use with touch screens on mobile phones|
KR20120038669A|2010-10-14|2012-04-24|김우찬|Korean alphabet arrangement structure of touch screen terminal|
JP5782699B2|2010-10-15|2015-09-24|ソニー株式会社|Information processing apparatus, input control method for information processing apparatus, and program|
US8587547B2|2010-11-05|2013-11-19|Apple Inc.|Device, method, and graphical user interface for manipulating soft keyboards|
US8754860B2|2010-11-05|2014-06-17|Apple Inc.|Device, method, and graphical user interface for manipulating soft keyboards|
JP2013033330A|2011-08-01|2013-02-14|Sony Corp|Information processing device, information processing method, and program|
US20130067411A1|2011-09-08|2013-03-14|Google Inc.|User gestures indicating rates of execution of functions|
US20130104068A1|2011-10-20|2013-04-25|Microsoft Corporation|Text prediction key|
WO2013067618A1|2011-11-09|2013-05-16|Research In Motion Limited|Touch-sensitive display method and apparatus|
US8904309B1|2011-11-23|2014-12-02|Google Inc.|Prediction completion gesture|
US9557913B2|2012-01-19|2017-01-31|Blackberry Limited|Virtual keyboard display having a ticker proximate to the virtual keyboard|
US9733707B2|2012-03-22|2017-08-15|Honeywell International Inc.|Touch screen display user interface and method for improving touch interface utility on the same employing a rules-based masking system|
US20130285927A1|2012-04-30|2013-10-31|Research In Motion Limited|Touchscreen keyboard with correction of previously input text|
EP2703957B1|2012-08-31|2018-06-06|BlackBerry Limited|Method to select word by swiping capacitive keyboard|
US20140063067A1|2012-08-31|2014-03-06|Research In Motion Limited|Method to select word by swiping capacitive keyboard|
US9021380B2|2012-10-05|2015-04-28|Google Inc.|Incremental multi-touch gesture recognition|
US8850350B2|2012-10-16|2014-09-30|Google Inc.|Partial gesture text entry|
US8584049B1|2012-10-16|2013-11-12|Google Inc.|Visual feedback deletion|
KR101380430B1|2012-12-20|2014-04-01|주식회사 팬택|Portable terminal for providing convenience information during call and method for providing convenience information during call in portable terminal|
US9298275B2|2013-02-04|2016-03-29|Blackberry Limited|Hybrid keyboard for mobile device|
CN104007832B|2013-02-25|2017-09-01|上海触乐信息科技有限公司|Continuous method, system and the equipment for sliding input text|
US9348429B2|2013-03-15|2016-05-24|Blackberry Limited|Method and apparatus for word prediction using the position of a non-typing digit|
US8887103B1|2013-04-22|2014-11-11|Google Inc.|Dynamically-positioned character string suggestions for gesture typing|
US10255267B2|2014-05-30|2019-04-09|Apple Inc.|Device, method, and graphical user interface for a predictive keyboard|
US8232973B2|2008-01-09|2012-07-31|Apple Inc.|Method, device, and graphical user interface providing word recommendations for text input|
Cited by:
US10255267B2|2014-05-30|2019-04-09|Apple Inc.|Device, method, and graphical user interface for a predictive keyboard|
US20160077735A1|2014-09-17|2016-03-17|Kabushiki Kaisha Toshiba|Character input apparatus and character input method|
US10061509B2|2014-10-09|2018-08-28|LenovoPte. Ltd.|Keypad control|
CN105491184B|2014-10-13|2018-09-25|富泰华工业(深圳)有限公司|Cell-phone cover with keyboard|
US20160246466A1|2015-02-23|2016-08-25|Nuance Communications, Inc.|Transparent full-screen text entry interface|
US10926786B2|2016-01-05|2021-02-23|Key Safety Systems, Inc.|Steering wheel with distributed sensors|
US10140017B2|2016-04-20|2018-11-27|Google Llc|Graphical keyboard application with integrated search|
US10222957B2|2016-04-20|2019-03-05|Google Llc|Keyboard with a suggested search query region|
US10305828B2|2016-04-20|2019-05-28|Google Llc|Search query predictions by a keyboard|
US10078673B2|2016-04-20|2018-09-18|Google Llc|Determining graphical elements associated with text|
US9965530B2|2016-04-20|2018-05-08|Google Llc|Graphical keyboard with integrated search features|
US20170308289A1|2016-04-20|2017-10-26|Google Inc.|Iconographic symbol search within a graphical keyboard|
US10664157B2|2016-08-03|2020-05-26|Google Llc|Image search query predictions by a keyboard|
KR102074764B1|2018-07-20|2020-02-07|네이버 주식회사|Method and system for supporting spell checking within input interface of mobile device|
KR20200098068A|2019-02-11|2020-08-20|삼성전자주식회사|Method for recommending word and apparatus thereof|
US11194467B2|2019-06-01|2021-12-07|Apple Inc.|Keyboard management user interfaces|
KR102158544B1|2019-09-30|2020-09-22|네이버 주식회사|Method and system for supporting spell checking within input interface of mobile device|
CN111638838A|2020-05-19|2020-09-08|维沃移动通信有限公司|Text editing method and device and electronic equipment|
Legal status:
2016-05-30| PLFP| Fee payment|Year of fee payment: 2 |
2017-05-30| PLFP| Fee payment|Year of fee payment: 3 |
2017-09-15| PLSC| Publication of the preliminary search report|Effective date: 20170915 |
2018-05-29| PLFP| Fee payment|Year of fee payment: 4 |
2019-04-10| PLFP| Fee payment|Year of fee payment: 5 |
2021-02-12| ST| Notification of lapse|Effective date: 20210105 |
Priority:
Application number | Application date | Patent title
KR20140059165|2014-05-16|
KR1020140059165A|KR102177607B1|2014-05-16|2014-05-16|Mobile terminal and method for controlling the same|